Jan 31 00:34:50 np0005603500 kernel: Linux version 5.14.0-665.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026
Jan 31 00:34:50 np0005603500 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 31 00:34:50 np0005603500 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 31 00:34:50 np0005603500 kernel: BIOS-provided physical RAM map:
Jan 31 00:34:50 np0005603500 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 31 00:34:50 np0005603500 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 31 00:34:50 np0005603500 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 31 00:34:50 np0005603500 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 31 00:34:50 np0005603500 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 31 00:34:50 np0005603500 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 31 00:34:50 np0005603500 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 31 00:34:50 np0005603500 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 31 00:34:50 np0005603500 kernel: NX (Execute Disable) protection: active
Jan 31 00:34:50 np0005603500 kernel: APIC: Static calls initialized
Jan 31 00:34:50 np0005603500 kernel: SMBIOS 2.8 present.
Jan 31 00:34:50 np0005603500 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 31 00:34:50 np0005603500 kernel: Hypervisor detected: KVM
Jan 31 00:34:50 np0005603500 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 31 00:34:50 np0005603500 kernel: kvm-clock: using sched offset of 10450915702 cycles
Jan 31 00:34:50 np0005603500 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 31 00:34:50 np0005603500 kernel: tsc: Detected 2800.000 MHz processor
Jan 31 00:34:50 np0005603500 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 31 00:34:50 np0005603500 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 31 00:34:50 np0005603500 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 31 00:34:50 np0005603500 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 31 00:34:50 np0005603500 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 31 00:34:50 np0005603500 kernel: Using GB pages for direct mapping
Jan 31 00:34:50 np0005603500 kernel: RAMDISK: [mem 0x2d410000-0x329fffff]
Jan 31 00:34:50 np0005603500 kernel: ACPI: Early table checksum verification disabled
Jan 31 00:34:50 np0005603500 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 31 00:34:50 np0005603500 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 00:34:50 np0005603500 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 00:34:50 np0005603500 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 00:34:50 np0005603500 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 31 00:34:50 np0005603500 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 00:34:50 np0005603500 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 00:34:50 np0005603500 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 31 00:34:50 np0005603500 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 31 00:34:50 np0005603500 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 31 00:34:50 np0005603500 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 31 00:34:50 np0005603500 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 31 00:34:50 np0005603500 kernel: No NUMA configuration found
Jan 31 00:34:50 np0005603500 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 31 00:34:50 np0005603500 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 31 00:34:50 np0005603500 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 31 00:34:50 np0005603500 kernel: Zone ranges:
Jan 31 00:34:50 np0005603500 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 31 00:34:50 np0005603500 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 31 00:34:50 np0005603500 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 31 00:34:50 np0005603500 kernel:  Device   empty
Jan 31 00:34:50 np0005603500 kernel: Movable zone start for each node
Jan 31 00:34:50 np0005603500 kernel: Early memory node ranges
Jan 31 00:34:50 np0005603500 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 31 00:34:50 np0005603500 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 31 00:34:50 np0005603500 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 31 00:34:50 np0005603500 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 31 00:34:50 np0005603500 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 31 00:34:50 np0005603500 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 31 00:34:50 np0005603500 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 31 00:34:50 np0005603500 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 31 00:34:50 np0005603500 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 31 00:34:50 np0005603500 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 31 00:34:50 np0005603500 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 31 00:34:50 np0005603500 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 31 00:34:50 np0005603500 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 31 00:34:50 np0005603500 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 31 00:34:50 np0005603500 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 31 00:34:50 np0005603500 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 31 00:34:50 np0005603500 kernel: TSC deadline timer available
Jan 31 00:34:50 np0005603500 kernel: CPU topo: Max. logical packages:   8
Jan 31 00:34:50 np0005603500 kernel: CPU topo: Max. logical dies:       8
Jan 31 00:34:50 np0005603500 kernel: CPU topo: Max. dies per package:   1
Jan 31 00:34:50 np0005603500 kernel: CPU topo: Max. threads per core:   1
Jan 31 00:34:50 np0005603500 kernel: CPU topo: Num. cores per package:     1
Jan 31 00:34:50 np0005603500 kernel: CPU topo: Num. threads per package:   1
Jan 31 00:34:50 np0005603500 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 31 00:34:50 np0005603500 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 31 00:34:50 np0005603500 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 31 00:34:50 np0005603500 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 31 00:34:50 np0005603500 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 31 00:34:50 np0005603500 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 31 00:34:50 np0005603500 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 31 00:34:50 np0005603500 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 31 00:34:50 np0005603500 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 31 00:34:50 np0005603500 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 31 00:34:50 np0005603500 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 31 00:34:50 np0005603500 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 31 00:34:50 np0005603500 kernel: Booting paravirtualized kernel on KVM
Jan 31 00:34:50 np0005603500 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 31 00:34:50 np0005603500 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 31 00:34:50 np0005603500 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 31 00:34:50 np0005603500 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 31 00:34:50 np0005603500 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 31 00:34:50 np0005603500 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64", will be passed to user space.
Jan 31 00:34:50 np0005603500 kernel: random: crng init done
Jan 31 00:34:50 np0005603500 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: Fallback order for Node 0: 0 
Jan 31 00:34:50 np0005603500 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 31 00:34:50 np0005603500 kernel: Policy zone: Normal
Jan 31 00:34:50 np0005603500 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 31 00:34:50 np0005603500 kernel: software IO TLB: area num 8.
Jan 31 00:34:50 np0005603500 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 31 00:34:50 np0005603500 kernel: ftrace: allocating 49438 entries in 194 pages
Jan 31 00:34:50 np0005603500 kernel: ftrace: allocated 194 pages with 3 groups
Jan 31 00:34:50 np0005603500 kernel: Dynamic Preempt: voluntary
Jan 31 00:34:50 np0005603500 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 31 00:34:50 np0005603500 kernel: rcu: 	RCU event tracing is enabled.
Jan 31 00:34:50 np0005603500 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 31 00:34:50 np0005603500 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 31 00:34:50 np0005603500 kernel: 	Rude variant of Tasks RCU enabled.
Jan 31 00:34:50 np0005603500 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 31 00:34:50 np0005603500 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 31 00:34:50 np0005603500 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 31 00:34:50 np0005603500 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 31 00:34:50 np0005603500 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 31 00:34:50 np0005603500 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 31 00:34:50 np0005603500 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 31 00:34:50 np0005603500 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 31 00:34:50 np0005603500 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 31 00:34:50 np0005603500 kernel: Console: colour VGA+ 80x25
Jan 31 00:34:50 np0005603500 kernel: printk: console [ttyS0] enabled
Jan 31 00:34:50 np0005603500 kernel: ACPI: Core revision 20230331
Jan 31 00:34:50 np0005603500 kernel: APIC: Switch to symmetric I/O mode setup
Jan 31 00:34:50 np0005603500 kernel: x2apic enabled
Jan 31 00:34:50 np0005603500 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 31 00:34:50 np0005603500 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 31 00:34:50 np0005603500 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 31 00:34:50 np0005603500 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 31 00:34:50 np0005603500 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 31 00:34:50 np0005603500 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 31 00:34:50 np0005603500 kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Jan 31 00:34:50 np0005603500 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 31 00:34:50 np0005603500 kernel: Spectre V2 : Mitigation: Retpolines
Jan 31 00:34:50 np0005603500 kernel: RETBleed: Mitigation: untrained return thunk
Jan 31 00:34:50 np0005603500 kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Jan 31 00:34:50 np0005603500 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 31 00:34:50 np0005603500 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 31 00:34:50 np0005603500 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 31 00:34:50 np0005603500 kernel: active return thunk: retbleed_return_thunk
Jan 31 00:34:50 np0005603500 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 31 00:34:50 np0005603500 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 31 00:34:50 np0005603500 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 31 00:34:50 np0005603500 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 31 00:34:50 np0005603500 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 31 00:34:50 np0005603500 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 31 00:34:50 np0005603500 kernel: Freeing SMP alternatives memory: 40K
Jan 31 00:34:50 np0005603500 kernel: pid_max: default: 32768 minimum: 301
Jan 31 00:34:50 np0005603500 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 31 00:34:50 np0005603500 kernel: landlock: Up and running.
Jan 31 00:34:50 np0005603500 kernel: Yama: becoming mindful.
Jan 31 00:34:50 np0005603500 kernel: SELinux:  Initializing.
Jan 31 00:34:50 np0005603500 kernel: LSM support for eBPF active
Jan 31 00:34:50 np0005603500 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 31 00:34:50 np0005603500 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 31 00:34:50 np0005603500 kernel: ... version:                0
Jan 31 00:34:50 np0005603500 kernel: ... bit width:              48
Jan 31 00:34:50 np0005603500 kernel: ... generic registers:      6
Jan 31 00:34:50 np0005603500 kernel: ... value mask:             0000ffffffffffff
Jan 31 00:34:50 np0005603500 kernel: ... max period:             00007fffffffffff
Jan 31 00:34:50 np0005603500 kernel: ... fixed-purpose events:   0
Jan 31 00:34:50 np0005603500 kernel: ... event mask:             000000000000003f
Jan 31 00:34:50 np0005603500 kernel: signal: max sigframe size: 1776
Jan 31 00:34:50 np0005603500 kernel: rcu: Hierarchical SRCU implementation.
Jan 31 00:34:50 np0005603500 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 31 00:34:50 np0005603500 kernel: smp: Bringing up secondary CPUs ...
Jan 31 00:34:50 np0005603500 kernel: smpboot: x86: Booting SMP configuration:
Jan 31 00:34:50 np0005603500 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 31 00:34:50 np0005603500 kernel: smp: Brought up 1 node, 8 CPUs
Jan 31 00:34:50 np0005603500 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 31 00:34:50 np0005603500 kernel: node 0 deferred pages initialised in 10ms
Jan 31 00:34:50 np0005603500 kernel: Memory: 7763776K/8388068K available (16384K kernel code, 5801K rwdata, 13928K rodata, 4196K init, 7192K bss, 618400K reserved, 0K cma-reserved)
Jan 31 00:34:50 np0005603500 kernel: devtmpfs: initialized
Jan 31 00:34:50 np0005603500 kernel: x86/mm: Memory block size: 128MB
Jan 31 00:34:50 np0005603500 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 31 00:34:50 np0005603500 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 31 00:34:50 np0005603500 kernel: pinctrl core: initialized pinctrl subsystem
Jan 31 00:34:50 np0005603500 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 31 00:34:50 np0005603500 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 31 00:34:50 np0005603500 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 31 00:34:50 np0005603500 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 31 00:34:50 np0005603500 kernel: audit: initializing netlink subsys (disabled)
Jan 31 00:34:50 np0005603500 kernel: audit: type=2000 audit(1769837688.630:1): state=initialized audit_enabled=0 res=1
Jan 31 00:34:50 np0005603500 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 31 00:34:50 np0005603500 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 31 00:34:50 np0005603500 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 31 00:34:50 np0005603500 kernel: cpuidle: using governor menu
Jan 31 00:34:50 np0005603500 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 31 00:34:50 np0005603500 kernel: PCI: Using configuration type 1 for base access
Jan 31 00:34:50 np0005603500 kernel: PCI: Using configuration type 1 for extended access
Jan 31 00:34:50 np0005603500 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 31 00:34:50 np0005603500 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 31 00:34:50 np0005603500 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 31 00:34:50 np0005603500 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 31 00:34:50 np0005603500 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 31 00:34:50 np0005603500 kernel: Demotion targets for Node 0: null
Jan 31 00:34:50 np0005603500 kernel: cryptd: max_cpu_qlen set to 1000
Jan 31 00:34:50 np0005603500 kernel: ACPI: Added _OSI(Module Device)
Jan 31 00:34:50 np0005603500 kernel: ACPI: Added _OSI(Processor Device)
Jan 31 00:34:50 np0005603500 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 31 00:34:50 np0005603500 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 31 00:34:50 np0005603500 kernel: ACPI: Interpreter enabled
Jan 31 00:34:50 np0005603500 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 31 00:34:50 np0005603500 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 31 00:34:50 np0005603500 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 31 00:34:50 np0005603500 kernel: PCI: Using E820 reservations for host bridge windows
Jan 31 00:34:50 np0005603500 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 31 00:34:50 np0005603500 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 31 00:34:50 np0005603500 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [3] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [4] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [5] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [6] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [7] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [8] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [9] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [10] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [11] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [12] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [13] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [14] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [15] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [16] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [17] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [18] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [19] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [20] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [21] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [22] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [23] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [24] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [25] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [26] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [27] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [28] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [29] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [30] registered
Jan 31 00:34:50 np0005603500 kernel: acpiphp: Slot [31] registered
Jan 31 00:34:50 np0005603500 kernel: PCI host bridge to bus 0000:00
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 31 00:34:50 np0005603500 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 31 00:34:50 np0005603500 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 31 00:34:50 np0005603500 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 31 00:34:50 np0005603500 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 31 00:34:50 np0005603500 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 31 00:34:50 np0005603500 kernel: iommu: Default domain type: Translated
Jan 31 00:34:50 np0005603500 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 31 00:34:50 np0005603500 kernel: SCSI subsystem initialized
Jan 31 00:34:50 np0005603500 kernel: ACPI: bus type USB registered
Jan 31 00:34:50 np0005603500 kernel: usbcore: registered new interface driver usbfs
Jan 31 00:34:50 np0005603500 kernel: usbcore: registered new interface driver hub
Jan 31 00:34:50 np0005603500 kernel: usbcore: registered new device driver usb
Jan 31 00:34:50 np0005603500 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 31 00:34:50 np0005603500 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 31 00:34:50 np0005603500 kernel: PTP clock support registered
Jan 31 00:34:50 np0005603500 kernel: EDAC MC: Ver: 3.0.0
Jan 31 00:34:50 np0005603500 kernel: NetLabel: Initializing
Jan 31 00:34:50 np0005603500 kernel: NetLabel:  domain hash size = 128
Jan 31 00:34:50 np0005603500 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 31 00:34:50 np0005603500 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 31 00:34:50 np0005603500 kernel: PCI: Using ACPI for IRQ routing
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 31 00:34:50 np0005603500 kernel: vgaarb: loaded
Jan 31 00:34:50 np0005603500 kernel: clocksource: Switched to clocksource kvm-clock
Jan 31 00:34:50 np0005603500 kernel: VFS: Disk quotas dquot_6.6.0
Jan 31 00:34:50 np0005603500 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 31 00:34:50 np0005603500 kernel: pnp: PnP ACPI init
Jan 31 00:34:50 np0005603500 kernel: pnp: PnP ACPI: found 5 devices
Jan 31 00:34:50 np0005603500 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 31 00:34:50 np0005603500 kernel: NET: Registered PF_INET protocol family
Jan 31 00:34:50 np0005603500 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 31 00:34:50 np0005603500 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 31 00:34:50 np0005603500 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 31 00:34:50 np0005603500 kernel: NET: Registered PF_XDP protocol family
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 31 00:34:50 np0005603500 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 31 00:34:50 np0005603500 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 31 00:34:50 np0005603500 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 38705 usecs
Jan 31 00:34:50 np0005603500 kernel: PCI: CLS 0 bytes, default 64
Jan 31 00:34:50 np0005603500 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 31 00:34:50 np0005603500 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 31 00:34:50 np0005603500 kernel: ACPI: bus type thunderbolt registered
Jan 31 00:34:50 np0005603500 kernel: Trying to unpack rootfs image as initramfs...
Jan 31 00:34:50 np0005603500 kernel: Initialise system trusted keyrings
Jan 31 00:34:50 np0005603500 kernel: Key type blacklist registered
Jan 31 00:34:50 np0005603500 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 31 00:34:50 np0005603500 kernel: zbud: loaded
Jan 31 00:34:50 np0005603500 kernel: integrity: Platform Keyring initialized
Jan 31 00:34:50 np0005603500 kernel: integrity: Machine keyring initialized
Jan 31 00:34:50 np0005603500 kernel: Freeing initrd memory: 88000K
Jan 31 00:34:50 np0005603500 kernel: NET: Registered PF_ALG protocol family
Jan 31 00:34:50 np0005603500 kernel: xor: automatically using best checksumming function   avx       
Jan 31 00:34:50 np0005603500 kernel: Key type asymmetric registered
Jan 31 00:34:50 np0005603500 kernel: Asymmetric key parser 'x509' registered
Jan 31 00:34:50 np0005603500 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 31 00:34:50 np0005603500 kernel: io scheduler mq-deadline registered
Jan 31 00:34:50 np0005603500 kernel: io scheduler kyber registered
Jan 31 00:34:50 np0005603500 kernel: io scheduler bfq registered
Jan 31 00:34:50 np0005603500 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 31 00:34:50 np0005603500 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 31 00:34:50 np0005603500 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 31 00:34:50 np0005603500 kernel: ACPI: button: Power Button [PWRF]
Jan 31 00:34:50 np0005603500 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 31 00:34:50 np0005603500 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 31 00:34:50 np0005603500 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 31 00:34:50 np0005603500 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 31 00:34:50 np0005603500 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 31 00:34:50 np0005603500 kernel: Non-volatile memory driver v1.3
Jan 31 00:34:50 np0005603500 kernel: rdac: device handler registered
Jan 31 00:34:50 np0005603500 kernel: hp_sw: device handler registered
Jan 31 00:34:50 np0005603500 kernel: emc: device handler registered
Jan 31 00:34:50 np0005603500 kernel: alua: device handler registered
Jan 31 00:34:50 np0005603500 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 31 00:34:50 np0005603500 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 31 00:34:50 np0005603500 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 31 00:34:50 np0005603500 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 31 00:34:50 np0005603500 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 31 00:34:50 np0005603500 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 31 00:34:50 np0005603500 kernel: usb usb1: Product: UHCI Host Controller
Jan 31 00:34:50 np0005603500 kernel: usb usb1: Manufacturer: Linux 5.14.0-665.el9.x86_64 uhci_hcd
Jan 31 00:34:50 np0005603500 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 31 00:34:50 np0005603500 kernel: hub 1-0:1.0: USB hub found
Jan 31 00:34:50 np0005603500 kernel: hub 1-0:1.0: 2 ports detected
Jan 31 00:34:50 np0005603500 kernel: usbcore: registered new interface driver usbserial_generic
Jan 31 00:34:50 np0005603500 kernel: usbserial: USB Serial support registered for generic
Jan 31 00:34:50 np0005603500 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 31 00:34:50 np0005603500 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 31 00:34:50 np0005603500 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 31 00:34:50 np0005603500 kernel: mousedev: PS/2 mouse device common for all mice
Jan 31 00:34:50 np0005603500 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 31 00:34:50 np0005603500 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 31 00:34:50 np0005603500 kernel: rtc_cmos 00:04: registered as rtc0
Jan 31 00:34:50 np0005603500 kernel: rtc_cmos 00:04: setting system clock to 2026-01-31T05:34:49 UTC (1769837689)
Jan 31 00:34:50 np0005603500 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 31 00:34:50 np0005603500 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 31 00:34:50 np0005603500 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 31 00:34:50 np0005603500 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 31 00:34:50 np0005603500 kernel: usbcore: registered new interface driver usbhid
Jan 31 00:34:50 np0005603500 kernel: usbhid: USB HID core driver
Jan 31 00:34:50 np0005603500 kernel: drop_monitor: Initializing network drop monitor service
Jan 31 00:34:50 np0005603500 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 31 00:34:50 np0005603500 kernel: Initializing XFRM netlink socket
Jan 31 00:34:50 np0005603500 kernel: NET: Registered PF_INET6 protocol family
Jan 31 00:34:50 np0005603500 kernel: Segment Routing with IPv6
Jan 31 00:34:50 np0005603500 kernel: NET: Registered PF_PACKET protocol family
Jan 31 00:34:50 np0005603500 kernel: mpls_gso: MPLS GSO support
Jan 31 00:34:50 np0005603500 kernel: IPI shorthand broadcast: enabled
Jan 31 00:34:50 np0005603500 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 31 00:34:50 np0005603500 kernel: AES CTR mode by8 optimization enabled
Jan 31 00:34:50 np0005603500 kernel: sched_clock: Marking stable (900003038, 139170181)->(1158492946, -119319727)
Jan 31 00:34:50 np0005603500 kernel: registered taskstats version 1
Jan 31 00:34:50 np0005603500 kernel: Loading compiled-in X.509 certificates
Jan 31 00:34:50 np0005603500 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 31 00:34:50 np0005603500 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 31 00:34:50 np0005603500 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 31 00:34:50 np0005603500 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 31 00:34:50 np0005603500 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 31 00:34:50 np0005603500 kernel: Demotion targets for Node 0: null
Jan 31 00:34:50 np0005603500 kernel: page_owner is disabled
Jan 31 00:34:50 np0005603500 kernel: Key type .fscrypt registered
Jan 31 00:34:50 np0005603500 kernel: Key type fscrypt-provisioning registered
Jan 31 00:34:50 np0005603500 kernel: Key type big_key registered
Jan 31 00:34:50 np0005603500 kernel: Key type encrypted registered
Jan 31 00:34:50 np0005603500 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 31 00:34:50 np0005603500 kernel: Loading compiled-in module X.509 certificates
Jan 31 00:34:50 np0005603500 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 31 00:34:50 np0005603500 kernel: ima: Allocated hash algorithm: sha256
Jan 31 00:34:50 np0005603500 kernel: ima: No architecture policies found
Jan 31 00:34:50 np0005603500 kernel: evm: Initialising EVM extended attributes:
Jan 31 00:34:50 np0005603500 kernel: evm: security.selinux
Jan 31 00:34:50 np0005603500 kernel: evm: security.SMACK64 (disabled)
Jan 31 00:34:50 np0005603500 kernel: evm: security.SMACK64EXEC (disabled)
Jan 31 00:34:50 np0005603500 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 31 00:34:50 np0005603500 kernel: evm: security.SMACK64MMAP (disabled)
Jan 31 00:34:50 np0005603500 kernel: evm: security.apparmor (disabled)
Jan 31 00:34:50 np0005603500 kernel: evm: security.ima
Jan 31 00:34:50 np0005603500 kernel: evm: security.capability
Jan 31 00:34:50 np0005603500 kernel: evm: HMAC attrs: 0x1
Jan 31 00:34:50 np0005603500 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 31 00:34:50 np0005603500 kernel: Running certificate verification RSA selftest
Jan 31 00:34:50 np0005603500 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 31 00:34:50 np0005603500 kernel: Running certificate verification ECDSA selftest
Jan 31 00:34:50 np0005603500 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 31 00:34:50 np0005603500 kernel: clk: Disabling unused clocks
Jan 31 00:34:50 np0005603500 kernel: Freeing unused decrypted memory: 2028K
Jan 31 00:34:50 np0005603500 kernel: Freeing unused kernel image (initmem) memory: 4196K
Jan 31 00:34:50 np0005603500 kernel: Write protecting the kernel read-only data: 30720k
Jan 31 00:34:50 np0005603500 kernel: Freeing unused kernel image (rodata/data gap) memory: 408K
Jan 31 00:34:50 np0005603500 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 31 00:34:50 np0005603500 kernel: Run /init as init process
Jan 31 00:34:50 np0005603500 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 31 00:34:50 np0005603500 systemd: Detected virtualization kvm.
Jan 31 00:34:50 np0005603500 systemd: Detected architecture x86-64.
Jan 31 00:34:50 np0005603500 systemd: Running in initrd.
Jan 31 00:34:50 np0005603500 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 31 00:34:50 np0005603500 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 31 00:34:50 np0005603500 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 31 00:34:50 np0005603500 kernel: usb 1-1: Manufacturer: QEMU
Jan 31 00:34:50 np0005603500 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 31 00:34:50 np0005603500 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 31 00:34:50 np0005603500 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 31 00:34:50 np0005603500 systemd: No hostname configured, using default hostname.
Jan 31 00:34:50 np0005603500 systemd: Hostname set to <localhost>.
Jan 31 00:34:50 np0005603500 systemd: Initializing machine ID from VM UUID.
Jan 31 00:34:50 np0005603500 systemd: Queued start job for default target Initrd Default Target.
Jan 31 00:34:50 np0005603500 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 31 00:34:50 np0005603500 systemd: Reached target Local Encrypted Volumes.
Jan 31 00:34:50 np0005603500 systemd: Reached target Initrd /usr File System.
Jan 31 00:34:50 np0005603500 systemd: Reached target Local File Systems.
Jan 31 00:34:50 np0005603500 systemd: Reached target Path Units.
Jan 31 00:34:50 np0005603500 systemd: Reached target Slice Units.
Jan 31 00:34:50 np0005603500 systemd: Reached target Swaps.
Jan 31 00:34:50 np0005603500 systemd: Reached target Timer Units.
Jan 31 00:34:50 np0005603500 systemd: Listening on D-Bus System Message Bus Socket.
Jan 31 00:34:50 np0005603500 systemd: Listening on Journal Socket (/dev/log).
Jan 31 00:34:50 np0005603500 systemd: Listening on Journal Socket.
Jan 31 00:34:50 np0005603500 systemd: Listening on udev Control Socket.
Jan 31 00:34:50 np0005603500 systemd: Listening on udev Kernel Socket.
Jan 31 00:34:50 np0005603500 systemd: Reached target Socket Units.
Jan 31 00:34:50 np0005603500 systemd: Starting Create List of Static Device Nodes...
Jan 31 00:34:50 np0005603500 systemd: Starting Journal Service...
Jan 31 00:34:50 np0005603500 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 31 00:34:50 np0005603500 systemd: Starting Apply Kernel Variables...
Jan 31 00:34:50 np0005603500 systemd: Starting Create System Users...
Jan 31 00:34:50 np0005603500 systemd: Starting Setup Virtual Console...
Jan 31 00:34:50 np0005603500 systemd: Finished Create List of Static Device Nodes.
Jan 31 00:34:50 np0005603500 systemd: Finished Apply Kernel Variables.
Jan 31 00:34:50 np0005603500 systemd: Finished Create System Users.
Jan 31 00:34:50 np0005603500 systemd-journald[304]: Journal started
Jan 31 00:34:50 np0005603500 systemd-journald[304]: Runtime Journal (/run/log/journal/e984390d4171477aad02535ae0fc7a74) is 8.0M, max 153.6M, 145.6M free.
Jan 31 00:34:50 np0005603500 systemd-sysusers[309]: Creating group 'users' with GID 100.
Jan 31 00:34:50 np0005603500 systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Jan 31 00:34:50 np0005603500 systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 31 00:34:50 np0005603500 systemd: Started Journal Service.
Jan 31 00:34:50 np0005603500 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 31 00:34:50 np0005603500 systemd[1]: Starting Create Volatile Files and Directories...
Jan 31 00:34:50 np0005603500 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 31 00:34:50 np0005603500 systemd[1]: Finished Create Volatile Files and Directories.
Jan 31 00:34:50 np0005603500 systemd[1]: Finished Setup Virtual Console.
Jan 31 00:34:50 np0005603500 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 31 00:34:50 np0005603500 systemd[1]: Starting dracut cmdline hook...
Jan 31 00:34:50 np0005603500 dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Jan 31 00:34:50 np0005603500 dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 31 00:34:50 np0005603500 systemd[1]: Finished dracut cmdline hook.
Jan 31 00:34:50 np0005603500 systemd[1]: Starting dracut pre-udev hook...
Jan 31 00:34:50 np0005603500 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 31 00:34:50 np0005603500 kernel: device-mapper: uevent: version 1.0.3
Jan 31 00:34:50 np0005603500 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 31 00:34:51 np0005603500 kernel: RPC: Registered named UNIX socket transport module.
Jan 31 00:34:51 np0005603500 kernel: RPC: Registered udp transport module.
Jan 31 00:34:51 np0005603500 kernel: RPC: Registered tcp transport module.
Jan 31 00:34:51 np0005603500 kernel: RPC: Registered tcp-with-tls transport module.
Jan 31 00:34:51 np0005603500 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 31 00:34:51 np0005603500 rpc.statd[440]: Version 2.5.4 starting
Jan 31 00:34:51 np0005603500 rpc.statd[440]: Initializing NSM state
Jan 31 00:34:51 np0005603500 rpc.idmapd[445]: Setting log level to 0
Jan 31 00:34:51 np0005603500 systemd[1]: Finished dracut pre-udev hook.
Jan 31 00:34:51 np0005603500 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 31 00:34:51 np0005603500 systemd-udevd[458]: Using default interface naming scheme 'rhel-9.0'.
Jan 31 00:34:51 np0005603500 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 31 00:34:51 np0005603500 systemd[1]: Starting dracut pre-trigger hook...
Jan 31 00:34:51 np0005603500 systemd[1]: Finished dracut pre-trigger hook.
Jan 31 00:34:51 np0005603500 systemd[1]: Starting Coldplug All udev Devices...
Jan 31 00:34:51 np0005603500 systemd[1]: Created slice Slice /system/modprobe.
Jan 31 00:34:51 np0005603500 systemd[1]: Starting Load Kernel Module configfs...
Jan 31 00:34:51 np0005603500 systemd[1]: Finished Coldplug All udev Devices.
Jan 31 00:34:51 np0005603500 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 31 00:34:51 np0005603500 systemd[1]: Finished Load Kernel Module configfs.
Jan 31 00:34:51 np0005603500 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 31 00:34:51 np0005603500 systemd[1]: Reached target Network.
Jan 31 00:34:51 np0005603500 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 31 00:34:51 np0005603500 systemd[1]: Starting dracut initqueue hook...
Jan 31 00:34:51 np0005603500 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 31 00:34:51 np0005603500 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 31 00:34:51 np0005603500 kernel: vda: vda1
Jan 31 00:34:51 np0005603500 kernel: scsi host0: ata_piix
Jan 31 00:34:51 np0005603500 kernel: scsi host1: ata_piix
Jan 31 00:34:51 np0005603500 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 31 00:34:51 np0005603500 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 31 00:34:51 np0005603500 systemd[1]: Found device /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 31 00:34:51 np0005603500 systemd[1]: Reached target Initrd Root Device.
Jan 31 00:34:51 np0005603500 kernel: ata1: found unknown device (class 0)
Jan 31 00:34:51 np0005603500 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 31 00:34:51 np0005603500 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 31 00:34:51 np0005603500 systemd[1]: Mounting Kernel Configuration File System...
Jan 31 00:34:51 np0005603500 systemd[1]: Mounted Kernel Configuration File System.
Jan 31 00:34:51 np0005603500 systemd[1]: Reached target System Initialization.
Jan 31 00:34:51 np0005603500 systemd[1]: Reached target Basic System.
Jan 31 00:34:51 np0005603500 systemd-udevd[472]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 00:34:51 np0005603500 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 31 00:34:51 np0005603500 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 31 00:34:51 np0005603500 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 31 00:34:51 np0005603500 systemd[1]: Finished dracut initqueue hook.
Jan 31 00:34:51 np0005603500 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 31 00:34:51 np0005603500 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 31 00:34:51 np0005603500 systemd[1]: Reached target Remote File Systems.
Jan 31 00:34:51 np0005603500 systemd[1]: Starting dracut pre-mount hook...
Jan 31 00:34:51 np0005603500 systemd[1]: Finished dracut pre-mount hook.
Jan 31 00:34:51 np0005603500 systemd[1]: Starting File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8...
Jan 31 00:34:51 np0005603500 systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Jan 31 00:34:51 np0005603500 systemd[1]: Finished File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 31 00:34:51 np0005603500 systemd[1]: Mounting /sysroot...
Jan 31 00:34:52 np0005603500 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 31 00:34:52 np0005603500 kernel: XFS (vda1): Mounting V5 Filesystem 822f14ea-6e7e-41df-b0d8-fbe282d9ded8
Jan 31 00:34:53 np0005603500 kernel: XFS (vda1): Ending clean mount
Jan 31 00:34:53 np0005603500 systemd[1]: Mounted /sysroot.
Jan 31 00:34:53 np0005603500 systemd[1]: Reached target Initrd Root File System.
Jan 31 00:34:53 np0005603500 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 31 00:34:53 np0005603500 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 31 00:34:53 np0005603500 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 31 00:34:53 np0005603500 systemd[1]: Reached target Initrd File Systems.
Jan 31 00:34:53 np0005603500 systemd[1]: Reached target Initrd Default Target.
Jan 31 00:34:53 np0005603500 systemd[1]: Starting dracut mount hook...
Jan 31 00:34:53 np0005603500 systemd[1]: Finished dracut mount hook.
Jan 31 00:34:53 np0005603500 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 31 00:34:53 np0005603500 rpc.idmapd[445]: exiting on signal 15
Jan 31 00:34:53 np0005603500 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 31 00:34:53 np0005603500 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 31 00:34:53 np0005603500 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Network.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Timer Units.
Jan 31 00:34:53 np0005603500 systemd[1]: dbus.socket: Deactivated successfully.
Jan 31 00:34:53 np0005603500 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 31 00:34:53 np0005603500 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Initrd Default Target.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Basic System.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Initrd Root Device.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Initrd /usr File System.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Path Units.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Remote File Systems.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Slice Units.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Socket Units.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target System Initialization.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Local File Systems.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Swaps.
Jan 31 00:34:53 np0005603500 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped dracut mount hook.
Jan 31 00:34:53 np0005603500 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped dracut pre-mount hook.
Jan 31 00:34:53 np0005603500 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 31 00:34:54 np0005603500 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped dracut initqueue hook.
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped Apply Kernel Variables.
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped Coldplug All udev Devices.
Jan 31 00:34:54 np0005603500 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped dracut pre-trigger hook.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped Setup Virtual Console.
Jan 31 00:34:54 np0005603500 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 31 00:34:54 np0005603500 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Closed udev Control Socket.
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Closed udev Kernel Socket.
Jan 31 00:34:54 np0005603500 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped dracut pre-udev hook.
Jan 31 00:34:54 np0005603500 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped dracut cmdline hook.
Jan 31 00:34:54 np0005603500 systemd[1]: Starting Cleanup udev Database...
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 31 00:34:54 np0005603500 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 31 00:34:54 np0005603500 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Stopped Create System Users.
Jan 31 00:34:54 np0005603500 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 31 00:34:54 np0005603500 systemd[1]: Finished Cleanup udev Database.
Jan 31 00:34:54 np0005603500 systemd[1]: Reached target Switch Root.
Jan 31 00:34:54 np0005603500 systemd[1]: Starting Switch Root...
Jan 31 00:34:54 np0005603500 systemd[1]: Switching root.
Jan 31 00:34:54 np0005603500 systemd-journald[304]: Journal stopped
Jan 31 00:34:55 np0005603500 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 31 00:34:55 np0005603500 kernel: audit: type=1404 audit(1769837694.283:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 31 00:34:55 np0005603500 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 00:34:55 np0005603500 kernel: SELinux:  policy capability open_perms=1
Jan 31 00:34:55 np0005603500 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 00:34:55 np0005603500 kernel: SELinux:  policy capability always_check_network=0
Jan 31 00:34:55 np0005603500 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 00:34:55 np0005603500 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 00:34:55 np0005603500 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 00:34:55 np0005603500 kernel: audit: type=1403 audit(1769837694.421:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 31 00:34:55 np0005603500 systemd: Successfully loaded SELinux policy in 143.987ms.
Jan 31 00:34:55 np0005603500 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 31.081ms.
Jan 31 00:34:55 np0005603500 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 31 00:34:55 np0005603500 systemd: Detected virtualization kvm.
Jan 31 00:34:55 np0005603500 systemd: Detected architecture x86-64.
Jan 31 00:34:55 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 00:34:55 np0005603500 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 31 00:34:55 np0005603500 systemd: Stopped Switch Root.
Jan 31 00:34:55 np0005603500 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 31 00:34:55 np0005603500 systemd: Created slice Slice /system/getty.
Jan 31 00:34:55 np0005603500 systemd: Created slice Slice /system/serial-getty.
Jan 31 00:34:55 np0005603500 systemd: Created slice Slice /system/sshd-keygen.
Jan 31 00:34:55 np0005603500 systemd: Created slice User and Session Slice.
Jan 31 00:34:55 np0005603500 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 31 00:34:55 np0005603500 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 31 00:34:55 np0005603500 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 31 00:34:55 np0005603500 systemd: Reached target Local Encrypted Volumes.
Jan 31 00:34:55 np0005603500 systemd: Stopped target Switch Root.
Jan 31 00:34:55 np0005603500 systemd: Stopped target Initrd File Systems.
Jan 31 00:34:55 np0005603500 systemd: Stopped target Initrd Root File System.
Jan 31 00:34:55 np0005603500 systemd: Reached target Local Integrity Protected Volumes.
Jan 31 00:34:55 np0005603500 systemd: Reached target Path Units.
Jan 31 00:34:55 np0005603500 systemd: Reached target rpc_pipefs.target.
Jan 31 00:34:55 np0005603500 systemd: Reached target Slice Units.
Jan 31 00:34:55 np0005603500 systemd: Reached target Swaps.
Jan 31 00:34:55 np0005603500 systemd: Reached target Local Verity Protected Volumes.
Jan 31 00:34:55 np0005603500 systemd: Listening on RPCbind Server Activation Socket.
Jan 31 00:34:55 np0005603500 systemd: Reached target RPC Port Mapper.
Jan 31 00:34:55 np0005603500 systemd: Listening on Process Core Dump Socket.
Jan 31 00:34:55 np0005603500 systemd: Listening on initctl Compatibility Named Pipe.
Jan 31 00:34:55 np0005603500 systemd: Listening on udev Control Socket.
Jan 31 00:34:55 np0005603500 systemd: Listening on udev Kernel Socket.
Jan 31 00:34:55 np0005603500 systemd: Mounting Huge Pages File System...
Jan 31 00:34:55 np0005603500 systemd: Mounting POSIX Message Queue File System...
Jan 31 00:34:55 np0005603500 systemd: Mounting Kernel Debug File System...
Jan 31 00:34:55 np0005603500 systemd: Mounting Kernel Trace File System...
Jan 31 00:34:55 np0005603500 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 31 00:34:55 np0005603500 systemd: Starting Create List of Static Device Nodes...
Jan 31 00:34:55 np0005603500 systemd: Starting Load Kernel Module configfs...
Jan 31 00:34:55 np0005603500 systemd: Starting Load Kernel Module drm...
Jan 31 00:34:55 np0005603500 systemd: Starting Load Kernel Module efi_pstore...
Jan 31 00:34:55 np0005603500 systemd: Starting Load Kernel Module fuse...
Jan 31 00:34:55 np0005603500 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 31 00:34:55 np0005603500 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 31 00:34:55 np0005603500 systemd: Stopped File System Check on Root Device.
Jan 31 00:34:55 np0005603500 systemd: Stopped Journal Service.
Jan 31 00:34:55 np0005603500 systemd: Starting Journal Service...
Jan 31 00:34:55 np0005603500 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 31 00:34:55 np0005603500 systemd: Starting Generate network units from Kernel command line...
Jan 31 00:34:55 np0005603500 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 31 00:34:55 np0005603500 systemd: Starting Remount Root and Kernel File Systems...
Jan 31 00:34:55 np0005603500 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 31 00:34:55 np0005603500 systemd: Starting Apply Kernel Variables...
Jan 31 00:34:55 np0005603500 systemd: Starting Coldplug All udev Devices...
Jan 31 00:34:55 np0005603500 kernel: fuse: init (API version 7.37)
Jan 31 00:34:55 np0005603500 systemd: Mounted Huge Pages File System.
Jan 31 00:34:55 np0005603500 systemd: Mounted POSIX Message Queue File System.
Jan 31 00:34:55 np0005603500 systemd: Mounted Kernel Debug File System.
Jan 31 00:34:55 np0005603500 systemd-journald[677]: Journal started
Jan 31 00:34:55 np0005603500 systemd-journald[677]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 31 00:34:55 np0005603500 systemd[1]: Queued start job for default target Multi-User System.
Jan 31 00:34:55 np0005603500 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 31 00:34:55 np0005603500 systemd: Started Journal Service.
Jan 31 00:34:55 np0005603500 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 31 00:34:55 np0005603500 systemd[1]: Mounted Kernel Trace File System.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Create List of Static Device Nodes.
Jan 31 00:34:55 np0005603500 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Load Kernel Module configfs.
Jan 31 00:34:55 np0005603500 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 31 00:34:55 np0005603500 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Load Kernel Module fuse.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Generate network units from Kernel command line.
Jan 31 00:34:55 np0005603500 kernel: ACPI: bus type drm_connector registered
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 31 00:34:55 np0005603500 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Load Kernel Module drm.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Apply Kernel Variables.
Jan 31 00:34:55 np0005603500 systemd[1]: Mounting FUSE Control File System...
Jan 31 00:34:55 np0005603500 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Rebuild Hardware Database...
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 31 00:34:55 np0005603500 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Load/Save OS Random Seed...
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Create System Users...
Jan 31 00:34:55 np0005603500 systemd[1]: Mounted FUSE Control File System.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Coldplug All udev Devices.
Jan 31 00:34:55 np0005603500 systemd-journald[677]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 31 00:34:55 np0005603500 systemd-journald[677]: Received client request to flush runtime journal.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Create System Users.
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 31 00:34:55 np0005603500 systemd[1]: Reached target Preparation for Local File Systems.
Jan 31 00:34:55 np0005603500 systemd[1]: Reached target Local File Systems.
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 31 00:34:55 np0005603500 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 31 00:34:55 np0005603500 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Automatic Boot Loader Update...
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Create Volatile Files and Directories...
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Load/Save OS Random Seed.
Jan 31 00:34:55 np0005603500 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 31 00:34:55 np0005603500 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 31 00:34:55 np0005603500 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 31 00:34:55 np0005603500 bootctl[695]: Couldn't find EFI system partition, skipping.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Automatic Boot Loader Update.
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Create Volatile Files and Directories.
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Security Auditing Service...
Jan 31 00:34:55 np0005603500 systemd[1]: Starting RPC Bind...
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Rebuild Journal Catalog...
Jan 31 00:34:55 np0005603500 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 31 00:34:55 np0005603500 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Rebuild Journal Catalog.
Jan 31 00:34:55 np0005603500 systemd[1]: Started RPC Bind.
Jan 31 00:34:55 np0005603500 augenrules[706]: /sbin/augenrules: No change
Jan 31 00:34:55 np0005603500 augenrules[721]: No rules
Jan 31 00:34:55 np0005603500 augenrules[721]: enabled 1
Jan 31 00:34:55 np0005603500 augenrules[721]: failure 1
Jan 31 00:34:55 np0005603500 augenrules[721]: pid 701
Jan 31 00:34:55 np0005603500 augenrules[721]: rate_limit 0
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog_limit 8192
Jan 31 00:34:55 np0005603500 augenrules[721]: lost 0
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog 4
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog_wait_time 60000
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog_wait_time_actual 0
Jan 31 00:34:55 np0005603500 augenrules[721]: enabled 1
Jan 31 00:34:55 np0005603500 augenrules[721]: failure 1
Jan 31 00:34:55 np0005603500 augenrules[721]: pid 701
Jan 31 00:34:55 np0005603500 augenrules[721]: rate_limit 0
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog_limit 8192
Jan 31 00:34:55 np0005603500 augenrules[721]: lost 0
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog 4
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog_wait_time 60000
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog_wait_time_actual 0
Jan 31 00:34:55 np0005603500 augenrules[721]: enabled 1
Jan 31 00:34:55 np0005603500 augenrules[721]: failure 1
Jan 31 00:34:55 np0005603500 augenrules[721]: pid 701
Jan 31 00:34:55 np0005603500 augenrules[721]: rate_limit 0
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog_limit 8192
Jan 31 00:34:55 np0005603500 augenrules[721]: lost 0
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog 4
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog_wait_time 60000
Jan 31 00:34:55 np0005603500 augenrules[721]: backlog_wait_time_actual 0
Jan 31 00:34:55 np0005603500 systemd[1]: Started Security Auditing Service.
Jan 31 00:34:55 np0005603500 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 31 00:34:55 np0005603500 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 31 00:34:56 np0005603500 systemd[1]: Finished Rebuild Hardware Database.
Jan 31 00:34:56 np0005603500 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 31 00:34:56 np0005603500 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Jan 31 00:34:56 np0005603500 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 31 00:34:56 np0005603500 systemd[1]: Starting Load Kernel Module configfs...
Jan 31 00:34:56 np0005603500 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 31 00:34:56 np0005603500 systemd[1]: Finished Load Kernel Module configfs.
Jan 31 00:34:56 np0005603500 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 31 00:34:56 np0005603500 systemd-udevd[744]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 00:34:56 np0005603500 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 31 00:34:56 np0005603500 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 31 00:34:56 np0005603500 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 31 00:34:56 np0005603500 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 31 00:34:56 np0005603500 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 31 00:34:56 np0005603500 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 31 00:34:56 np0005603500 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 31 00:34:56 np0005603500 kernel: Console: switching to colour dummy device 80x25
Jan 31 00:34:56 np0005603500 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 31 00:34:56 np0005603500 kernel: [drm] features: -context_init
Jan 31 00:34:56 np0005603500 kernel: [drm] number of scanouts: 1
Jan 31 00:34:56 np0005603500 kernel: [drm] number of cap sets: 0
Jan 31 00:34:56 np0005603500 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 31 00:34:56 np0005603500 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 31 00:34:56 np0005603500 kernel: Console: switching to colour frame buffer device 128x48
Jan 31 00:34:56 np0005603500 systemd[1]: Starting Update is Completed...
Jan 31 00:34:56 np0005603500 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 31 00:34:56 np0005603500 systemd[1]: Finished Update is Completed.
Jan 31 00:34:56 np0005603500 systemd[1]: Reached target System Initialization.
Jan 31 00:34:56 np0005603500 systemd[1]: Started dnf makecache --timer.
Jan 31 00:34:56 np0005603500 systemd[1]: Started Daily rotation of log files.
Jan 31 00:34:56 np0005603500 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 31 00:34:56 np0005603500 systemd[1]: Reached target Timer Units.
Jan 31 00:34:56 np0005603500 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 31 00:34:56 np0005603500 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 31 00:34:56 np0005603500 kernel: kvm_amd: TSC scaling supported
Jan 31 00:34:56 np0005603500 kernel: kvm_amd: Nested Virtualization enabled
Jan 31 00:34:56 np0005603500 kernel: kvm_amd: Nested Paging enabled
Jan 31 00:34:56 np0005603500 kernel: kvm_amd: LBR virtualization supported
Jan 31 00:34:56 np0005603500 systemd[1]: Reached target Socket Units.
Jan 31 00:34:56 np0005603500 systemd[1]: Starting D-Bus System Message Bus...
Jan 31 00:34:56 np0005603500 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 31 00:34:56 np0005603500 systemd[1]: Started D-Bus System Message Bus.
Jan 31 00:34:56 np0005603500 systemd[1]: Reached target Basic System.
Jan 31 00:34:56 np0005603500 dbus-broker-lau[789]: Ready
Jan 31 00:34:56 np0005603500 systemd[1]: Starting NTP client/server...
Jan 31 00:34:56 np0005603500 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 31 00:34:56 np0005603500 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 31 00:34:56 np0005603500 systemd[1]: Starting IPv4 firewall with iptables...
Jan 31 00:34:56 np0005603500 systemd[1]: Started irqbalance daemon.
Jan 31 00:34:56 np0005603500 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 31 00:34:56 np0005603500 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 00:34:56 np0005603500 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 00:34:56 np0005603500 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 00:34:56 np0005603500 systemd[1]: Reached target sshd-keygen.target.
Jan 31 00:34:56 np0005603500 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 31 00:34:56 np0005603500 systemd[1]: Reached target User and Group Name Lookups.
Jan 31 00:34:56 np0005603500 systemd[1]: Starting User Login Management...
Jan 31 00:34:56 np0005603500 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 31 00:34:56 np0005603500 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 31 00:34:56 np0005603500 chronyd[830]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 31 00:34:56 np0005603500 systemd-logind[821]: New seat seat0.
Jan 31 00:34:56 np0005603500 systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 31 00:34:56 np0005603500 systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 31 00:34:56 np0005603500 systemd[1]: Started User Login Management.
Jan 31 00:34:56 np0005603500 chronyd[830]: Loaded 0 symmetric keys
Jan 31 00:34:56 np0005603500 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 31 00:34:56 np0005603500 chronyd[830]: Using right/UTC timezone to obtain leap second data
Jan 31 00:34:56 np0005603500 chronyd[830]: Loaded seccomp filter (level 2)
Jan 31 00:34:56 np0005603500 systemd[1]: Started NTP client/server.
Jan 31 00:34:56 np0005603500 iptables.init[815]: iptables: Applying firewall rules: [  OK  ]
Jan 31 00:34:56 np0005603500 systemd[1]: Finished IPv4 firewall with iptables.
Jan 31 00:34:57 np0005603500 cloud-init[839]: Cloud-init v. 24.4-8.el9 running 'init-local' at Sat, 31 Jan 2026 05:34:57 +0000. Up 9.22 seconds.
Jan 31 00:34:58 np0005603500 systemd[1]: run-cloud\x2dinit-tmp-tmp93ancu03.mount: Deactivated successfully.
Jan 31 00:34:58 np0005603500 systemd[1]: Starting Hostname Service...
Jan 31 00:34:58 np0005603500 systemd[1]: Started Hostname Service.
Jan 31 00:34:58 np0005603500 systemd-hostnamed[853]: Hostname set to <np0005603500.novalocal> (static)
Jan 31 00:34:58 np0005603500 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 31 00:34:58 np0005603500 systemd[1]: Reached target Preparation for Network.
Jan 31 00:34:58 np0005603500 systemd[1]: Starting Network Manager...
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.4949] NetworkManager (version 1.54.3-2.el9) is starting... (boot:3dddb628-1b36-4865-82fe-2cb3f5410e26)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.4954] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.5703] manager[0x561a1cb69000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.5744] hostname: hostname: using hostnamed
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.5744] hostname: static hostname changed from (none) to "np0005603500.novalocal"
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.5752] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.5976] manager[0x561a1cb69000]: rfkill: Wi-Fi hardware radio set enabled
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.5976] manager[0x561a1cb69000]: rfkill: WWAN hardware radio set enabled
Jan 31 00:34:58 np0005603500 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6686] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6687] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6688] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6689] manager: Networking is enabled by state file
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6692] settings: Loaded settings plugin: keyfile (internal)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6792] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6830] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 31 00:34:58 np0005603500 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6958] dhcp: init: Using DHCP client 'internal'
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6963] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6983] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.6998] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7014] device (lo): Activation: starting connection 'lo' (5b1c2e8d-86e8-4488-ae63-5b9b7ed60e29)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7028] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7033] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 00:34:58 np0005603500 systemd[1]: Started Network Manager.
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7069] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7079] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7084] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7088] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7092] device (eth0): carrier: link connected
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7099] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 31 00:34:58 np0005603500 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7112] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7129] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7133] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7140] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7141] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 31 00:34:58 np0005603500 systemd[1]: Reached target Network.
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7153] device (lo): Activation: successful, device activated.
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7167] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7172] manager: NetworkManager state is now CONNECTING
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7176] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7193] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7200] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7252] dhcp4 (eth0): state changed new lease, address=38.102.83.9
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7262] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7289] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7311] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7314] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7321] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7325] device (eth0): Activation: successful, device activated.
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7332] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 31 00:34:58 np0005603500 NetworkManager[857]: <info>  [1769837698.7337] manager: startup complete
Jan 31 00:34:58 np0005603500 systemd[1]: Starting Network Manager Wait Online...
Jan 31 00:34:58 np0005603500 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 31 00:34:58 np0005603500 systemd[1]: Finished Network Manager Wait Online.
Jan 31 00:34:58 np0005603500 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 31 00:34:58 np0005603500 systemd[1]: Starting Cloud-init: Network Stage...
Jan 31 00:34:58 np0005603500 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 31 00:34:58 np0005603500 systemd[1]: Reached target NFS client services.
Jan 31 00:34:58 np0005603500 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 31 00:34:58 np0005603500 systemd[1]: Reached target Remote File Systems.
Jan 31 00:34:58 np0005603500 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 31 00:34:59 np0005603500 cloud-init[921]: Cloud-init v. 24.4-8.el9 running 'init' at Sat, 31 Jan 2026 05:34:59 +0000. Up 10.38 seconds.
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: |  eth0  | True |         38.102.83.9          | 255.255.255.0 | global | fa:16:3e:21:a2:e0 |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:fe21:a2e0/64 |       .       |  link  | fa:16:3e:21:a2:e0 |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 31 00:34:59 np0005603500 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 31 00:35:00 np0005603500 cloud-init[921]: Generating public/private rsa key pair.
Jan 31 00:35:00 np0005603500 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 31 00:35:00 np0005603500 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 31 00:35:00 np0005603500 cloud-init[921]: The key fingerprint is:
Jan 31 00:35:00 np0005603500 cloud-init[921]: SHA256:htnZ7a6ZDsrqy95mrO3f3fNOOJsuZkaCTiAGDdysgh0 root@np0005603500.novalocal
Jan 31 00:35:00 np0005603500 cloud-init[921]: The key's randomart image is:
Jan 31 00:35:00 np0005603500 cloud-init[921]: +---[RSA 3072]----+
Jan 31 00:35:00 np0005603500 cloud-init[921]: | ..=             |
Jan 31 00:35:00 np0005603500 cloud-init[921]: |  E +            |
Jan 31 00:35:00 np0005603500 cloud-init[921]: |.. +             |
Jan 31 00:35:00 np0005603500 cloud-init[921]: |o o o .+ o .     |
Jan 31 00:35:00 np0005603500 cloud-init[921]: | . . .o.S.. .    |
Jan 31 00:35:00 np0005603500 cloud-init[921]: |       .o ...  . |
Jan 31 00:35:00 np0005603500 cloud-init[921]: |     . o.  o. o .|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |   . ++..o ==..= |
Jan 31 00:35:00 np0005603500 cloud-init[921]: |   oBB*...*=oo=+o|
Jan 31 00:35:00 np0005603500 cloud-init[921]: +----[SHA256]-----+
Jan 31 00:35:00 np0005603500 cloud-init[921]: Generating public/private ecdsa key pair.
Jan 31 00:35:00 np0005603500 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 31 00:35:00 np0005603500 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 31 00:35:00 np0005603500 cloud-init[921]: The key fingerprint is:
Jan 31 00:35:00 np0005603500 cloud-init[921]: SHA256:h1PdkYw+Yy5GR23X7ygktnt1T1QrCMiwUtIDFcZt5p4 root@np0005603500.novalocal
Jan 31 00:35:00 np0005603500 cloud-init[921]: The key's randomart image is:
Jan 31 00:35:00 np0005603500 cloud-init[921]: +---[ECDSA 256]---+
Jan 31 00:35:00 np0005603500 cloud-init[921]: |   o=*= .    +...|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |    ++.* . .o.=.+|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |   . .=   ooo...+|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |    .  . o+.B. .o|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |      . So.B o.+ |
Jan 31 00:35:00 np0005603500 cloud-init[921]: |       E o+ o o +|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |         . o o o.|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |          . .   .|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |           .     |
Jan 31 00:35:00 np0005603500 cloud-init[921]: +----[SHA256]-----+
Jan 31 00:35:00 np0005603500 cloud-init[921]: Generating public/private ed25519 key pair.
Jan 31 00:35:00 np0005603500 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 31 00:35:00 np0005603500 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 31 00:35:00 np0005603500 cloud-init[921]: The key fingerprint is:
Jan 31 00:35:00 np0005603500 cloud-init[921]: SHA256:mMry27+JRgcp+ofp3i9fGR93RK0K3/kQNk2oj4KtajE root@np0005603500.novalocal
Jan 31 00:35:00 np0005603500 cloud-init[921]: The key's randomart image is:
Jan 31 00:35:00 np0005603500 cloud-init[921]: +--[ED25519 256]--+
Jan 31 00:35:00 np0005603500 cloud-init[921]: |               .o|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |              ..o|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |       .     . +.|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |    . oo  . . =..|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |   . .o.So.o.*.+.|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |  .. .E o o++o=. |
Jan 31 00:35:00 np0005603500 cloud-init[921]: |  ..o+ + .o..  o |
Jan 31 00:35:00 np0005603500 cloud-init[921]: |   o+o=..o      .|
Jan 31 00:35:00 np0005603500 cloud-init[921]: |   o===**.       |
Jan 31 00:35:00 np0005603500 cloud-init[921]: +----[SHA256]-----+
Jan 31 00:35:00 np0005603500 systemd[1]: Finished Cloud-init: Network Stage.
Jan 31 00:35:00 np0005603500 systemd[1]: Reached target Cloud-config availability.
Jan 31 00:35:00 np0005603500 systemd[1]: Reached target Network is Online.
Jan 31 00:35:00 np0005603500 systemd[1]: Starting Cloud-init: Config Stage...
Jan 31 00:35:00 np0005603500 systemd[1]: Starting Crash recovery kernel arming...
Jan 31 00:35:00 np0005603500 systemd[1]: Starting Notify NFS peers of a restart...
Jan 31 00:35:00 np0005603500 systemd[1]: Starting System Logging Service...
Jan 31 00:35:00 np0005603500 systemd[1]: Starting OpenSSH server daemon...
Jan 31 00:35:00 np0005603500 sm-notify[1004]: Version 2.5.4 starting
Jan 31 00:35:00 np0005603500 systemd[1]: Starting Permit User Sessions...
Jan 31 00:35:00 np0005603500 systemd[1]: Started Notify NFS peers of a restart.
Jan 31 00:35:00 np0005603500 systemd[1]: Finished Permit User Sessions.
Jan 31 00:35:00 np0005603500 systemd[1]: Started Command Scheduler.
Jan 31 00:35:00 np0005603500 systemd[1]: Started Getty on tty1.
Jan 31 00:35:00 np0005603500 systemd[1]: Started Serial Getty on ttyS0.
Jan 31 00:35:00 np0005603500 systemd[1]: Reached target Login Prompts.
Jan 31 00:35:00 np0005603500 systemd[1]: Started OpenSSH server daemon.
Jan 31 00:35:00 np0005603500 rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Jan 31 00:35:00 np0005603500 rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 31 00:35:00 np0005603500 systemd[1]: Started System Logging Service.
Jan 31 00:35:00 np0005603500 systemd[1]: Reached target Multi-User System.
Jan 31 00:35:00 np0005603500 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 31 00:35:00 np0005603500 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 31 00:35:00 np0005603500 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 31 00:35:00 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 00:35:00 np0005603500 kdumpctl[1014]: kdump: No kdump initial ramdisk found.
Jan 31 00:35:00 np0005603500 kdumpctl[1014]: kdump: Rebuilding /boot/initramfs-5.14.0-665.el9.x86_64kdump.img
Jan 31 00:35:00 np0005603500 cloud-init[1113]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Sat, 31 Jan 2026 05:35:00 +0000. Up 11.96 seconds.
Jan 31 00:35:00 np0005603500 systemd[1]: Finished Cloud-init: Config Stage.
Jan 31 00:35:00 np0005603500 systemd[1]: Starting Cloud-init: Final Stage...
Jan 31 00:35:00 np0005603500 dracut[1267]: dracut-057-102.git20250818.el9
Jan 31 00:35:01 np0005603500 cloud-init[1283]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Sat, 31 Jan 2026 05:35:00 +0000. Up 12.34 seconds.
Jan 31 00:35:01 np0005603500 dracut[1269]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-665.el9.x86_64kdump.img 5.14.0-665.el9.x86_64
Jan 31 00:35:01 np0005603500 cloud-init[1299]: #############################################################
Jan 31 00:35:01 np0005603500 cloud-init[1302]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 31 00:35:01 np0005603500 cloud-init[1307]: 256 SHA256:h1PdkYw+Yy5GR23X7ygktnt1T1QrCMiwUtIDFcZt5p4 root@np0005603500.novalocal (ECDSA)
Jan 31 00:35:01 np0005603500 cloud-init[1311]: 256 SHA256:mMry27+JRgcp+ofp3i9fGR93RK0K3/kQNk2oj4KtajE root@np0005603500.novalocal (ED25519)
Jan 31 00:35:01 np0005603500 cloud-init[1319]: 3072 SHA256:htnZ7a6ZDsrqy95mrO3f3fNOOJsuZkaCTiAGDdysgh0 root@np0005603500.novalocal (RSA)
Jan 31 00:35:01 np0005603500 cloud-init[1320]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 31 00:35:01 np0005603500 cloud-init[1323]: #############################################################
Jan 31 00:35:01 np0005603500 cloud-init[1283]: Cloud-init v. 24.4-8.el9 finished at Sat, 31 Jan 2026 05:35:01 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 12.50 seconds
Jan 31 00:35:01 np0005603500 systemd[1]: Finished Cloud-init: Final Stage.
Jan 31 00:35:01 np0005603500 systemd[1]: Reached target Cloud-init target.
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 31 00:35:01 np0005603500 dracut[1269]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: memstrack is not available
Jan 31 00:35:02 np0005603500 dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 31 00:35:02 np0005603500 dracut[1269]: memstrack is not available
Jan 31 00:35:02 np0005603500 dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 31 00:35:02 np0005603500 dracut[1269]: *** Including module: systemd ***
Jan 31 00:35:02 np0005603500 dracut[1269]: *** Including module: fips ***
Jan 31 00:35:02 np0005603500 dracut[1269]: *** Including module: systemd-initrd ***
Jan 31 00:35:02 np0005603500 dracut[1269]: *** Including module: i18n ***
Jan 31 00:35:02 np0005603500 dracut[1269]: *** Including module: drm ***
Jan 31 00:35:03 np0005603500 dracut[1269]: *** Including module: prefixdevname ***
Jan 31 00:35:03 np0005603500 dracut[1269]: *** Including module: kernel-modules ***
Jan 31 00:35:03 np0005603500 kernel: block vda: the capability attribute has been deprecated.
Jan 31 00:35:03 np0005603500 chronyd[830]: Selected source 167.160.187.12 (2.centos.pool.ntp.org)
Jan 31 00:35:03 np0005603500 chronyd[830]: System clock TAI offset set to 37 seconds
Jan 31 00:35:03 np0005603500 dracut[1269]: *** Including module: kernel-modules-extra ***
Jan 31 00:35:03 np0005603500 dracut[1269]: *** Including module: qemu ***
Jan 31 00:35:04 np0005603500 dracut[1269]: *** Including module: fstab-sys ***
Jan 31 00:35:04 np0005603500 dracut[1269]: *** Including module: rootfs-block ***
Jan 31 00:35:04 np0005603500 dracut[1269]: *** Including module: terminfo ***
Jan 31 00:35:04 np0005603500 dracut[1269]: *** Including module: udev-rules ***
Jan 31 00:35:04 np0005603500 dracut[1269]: Skipping udev rule: 91-permissions.rules
Jan 31 00:35:04 np0005603500 dracut[1269]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 31 00:35:04 np0005603500 dracut[1269]: *** Including module: virtiofs ***
Jan 31 00:35:04 np0005603500 dracut[1269]: *** Including module: dracut-systemd ***
Jan 31 00:35:04 np0005603500 dracut[1269]: *** Including module: usrmount ***
Jan 31 00:35:04 np0005603500 dracut[1269]: *** Including module: base ***
Jan 31 00:35:04 np0005603500 dracut[1269]: *** Including module: fs-lib ***
Jan 31 00:35:05 np0005603500 dracut[1269]: *** Including module: kdumpbase ***
Jan 31 00:35:05 np0005603500 dracut[1269]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 31 00:35:05 np0005603500 dracut[1269]:  microcode_ctl module: mangling fw_dir
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 31 00:35:05 np0005603500 chronyd[830]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 31 00:35:05 np0005603500 dracut[1269]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 31 00:35:05 np0005603500 dracut[1269]: *** Including module: openssl ***
Jan 31 00:35:05 np0005603500 dracut[1269]: *** Including module: shutdown ***
Jan 31 00:35:05 np0005603500 dracut[1269]: *** Including module: squash ***
Jan 31 00:35:05 np0005603500 dracut[1269]: *** Including modules done ***
Jan 31 00:35:05 np0005603500 dracut[1269]: *** Installing kernel module dependencies ***
Jan 31 00:35:06 np0005603500 irqbalance[816]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 31 00:35:06 np0005603500 irqbalance[816]: IRQ 25 affinity is now unmanaged
Jan 31 00:35:06 np0005603500 irqbalance[816]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 31 00:35:06 np0005603500 irqbalance[816]: IRQ 31 affinity is now unmanaged
Jan 31 00:35:06 np0005603500 irqbalance[816]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 31 00:35:06 np0005603500 irqbalance[816]: IRQ 28 affinity is now unmanaged
Jan 31 00:35:06 np0005603500 irqbalance[816]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 31 00:35:06 np0005603500 irqbalance[816]: IRQ 32 affinity is now unmanaged
Jan 31 00:35:06 np0005603500 irqbalance[816]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 31 00:35:06 np0005603500 irqbalance[816]: IRQ 30 affinity is now unmanaged
Jan 31 00:35:06 np0005603500 irqbalance[816]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 31 00:35:06 np0005603500 irqbalance[816]: IRQ 29 affinity is now unmanaged
Jan 31 00:35:06 np0005603500 dracut[1269]: *** Installing kernel module dependencies done ***
Jan 31 00:35:06 np0005603500 dracut[1269]: *** Resolving executable dependencies ***
Jan 31 00:35:08 np0005603500 dracut[1269]: *** Resolving executable dependencies done ***
Jan 31 00:35:08 np0005603500 dracut[1269]: *** Generating early-microcode cpio image ***
Jan 31 00:35:08 np0005603500 dracut[1269]: *** Store current command line parameters ***
Jan 31 00:35:08 np0005603500 dracut[1269]: Stored kernel commandline:
Jan 31 00:35:08 np0005603500 dracut[1269]: No dracut internal kernel commandline stored in the initramfs
Jan 31 00:35:08 np0005603500 dracut[1269]: *** Install squash loader ***
Jan 31 00:35:08 np0005603500 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 00:35:09 np0005603500 dracut[1269]: *** Squashing the files inside the initramfs ***
Jan 31 00:35:10 np0005603500 dracut[1269]: *** Squashing the files inside the initramfs done ***
Jan 31 00:35:10 np0005603500 dracut[1269]: *** Creating image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' ***
Jan 31 00:35:10 np0005603500 dracut[1269]: *** Hardlinking files ***
Jan 31 00:35:10 np0005603500 dracut[1269]: *** Hardlinking files done ***
Jan 31 00:35:10 np0005603500 dracut[1269]: *** Creating initramfs image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' done ***
Jan 31 00:35:11 np0005603500 kdumpctl[1014]: kdump: kexec: loaded kdump kernel
Jan 31 00:35:11 np0005603500 kdumpctl[1014]: kdump: Starting kdump: [OK]
Jan 31 00:35:11 np0005603500 systemd[1]: Finished Crash recovery kernel arming.
Jan 31 00:35:11 np0005603500 systemd[1]: Startup finished in 1.180s (kernel) + 4.438s (initrd) + 17.094s (userspace) = 22.714s.
Jan 31 00:35:17 np0005603500 systemd[1]: Created slice User Slice of UID 1000.
Jan 31 00:35:17 np0005603500 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 31 00:35:17 np0005603500 systemd-logind[821]: New session 1 of user zuul.
Jan 31 00:35:17 np0005603500 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 31 00:35:17 np0005603500 systemd[1]: Starting User Manager for UID 1000...
Jan 31 00:35:17 np0005603500 systemd[4307]: Queued start job for default target Main User Target.
Jan 31 00:35:17 np0005603500 systemd[4307]: Created slice User Application Slice.
Jan 31 00:35:17 np0005603500 systemd[4307]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 00:35:17 np0005603500 systemd[4307]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 00:35:17 np0005603500 systemd[4307]: Reached target Paths.
Jan 31 00:35:17 np0005603500 systemd[4307]: Reached target Timers.
Jan 31 00:35:17 np0005603500 systemd[4307]: Starting D-Bus User Message Bus Socket...
Jan 31 00:35:17 np0005603500 systemd[4307]: Starting Create User's Volatile Files and Directories...
Jan 31 00:35:18 np0005603500 systemd[4307]: Finished Create User's Volatile Files and Directories.
Jan 31 00:35:18 np0005603500 systemd[4307]: Listening on D-Bus User Message Bus Socket.
Jan 31 00:35:18 np0005603500 systemd[4307]: Reached target Sockets.
Jan 31 00:35:18 np0005603500 systemd[4307]: Reached target Basic System.
Jan 31 00:35:18 np0005603500 systemd[4307]: Reached target Main User Target.
Jan 31 00:35:18 np0005603500 systemd[4307]: Startup finished in 136ms.
Jan 31 00:35:18 np0005603500 systemd[1]: Started User Manager for UID 1000.
Jan 31 00:35:18 np0005603500 systemd[1]: Started Session 1 of User zuul.
Jan 31 00:35:18 np0005603500 python3[4389]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 00:35:21 np0005603500 python3[4417]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 00:35:27 np0005603500 python3[4475]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 00:35:28 np0005603500 python3[4515]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 31 00:35:28 np0005603500 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 00:35:30 np0005603500 python3[4543]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDi3wxsBmQi5KJC5hWXUTxbXHZc4UO/fPqHa0lCiDjwlTDX2+RX7Ky9TUiHQ0Pp0/oFT/LJ/CGNdtQPmi+6KdrVAk7QPaE3+ACJcTMl5srx0FR5FwvmujmloiZQGQ+yw1GrvN8sDtq9dY7cNMNJNiUg6xHHusK9NREW1OdxiWgk0ftu7TGX6qzOwLV4cAgSn8hyziLZBS31ensVbwLCwsLHggPQUItQxkuZZgnM257ggF6uTseRB2N+dYUKg8InKEKARwbeiLM3Q7hZeObg8TUMDoqpuO7gXMyYdTsI4axKJxpZuvbeMqa1fW659pywTTLYVigzJa/uRRuzUp9e59eSRkK7IvXBe0DDVumv9VRwwM8qgmgJCrK7YgVO/b7r3K79GA1c39Cuh3AVIIXBSN8FZWYa8NrEXLaZt4z95ucTiwQdWLmXHLD+64wwo6XjfYcYnrb8DrTvS7Atq7QTZm2D+KRU5VSbDzY83DfPNzEqhAJF7jiiGo4X6OJejCGRfNk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:30 np0005603500 python3[4567]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:31 np0005603500 python3[4666]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:35:31 np0005603500 python3[4737]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769837730.9176114-207-118617251893062/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=c8261a2e04ae49778b57227944eb5077_id_rsa follow=False checksum=952ff2894b55490ce7a7d9db26bb1b178316b149 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:32 np0005603500 python3[4860]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:35:32 np0005603500 python3[4931]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769837731.9167264-240-101236721403889/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=c8261a2e04ae49778b57227944eb5077_id_rsa.pub follow=False checksum=554972c8a3505c1e51a171429ac5dbd85f0b77ac backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:34 np0005603500 python3[4979]: ansible-ping Invoked with data=pong
Jan 31 00:35:35 np0005603500 python3[5003]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 00:35:36 np0005603500 python3[5061]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 31 00:35:37 np0005603500 python3[5093]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:37 np0005603500 python3[5117]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:38 np0005603500 python3[5141]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:38 np0005603500 python3[5165]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:38 np0005603500 python3[5189]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:38 np0005603500 python3[5213]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:40 np0005603500 python3[5239]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:40 np0005603500 python3[5317]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:35:41 np0005603500 python3[5390]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769837740.5914164-21-184424177969903/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:42 np0005603500 python3[5438]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:42 np0005603500 python3[5462]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:42 np0005603500 python3[5486]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:42 np0005603500 python3[5510]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:43 np0005603500 python3[5534]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:43 np0005603500 python3[5558]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:43 np0005603500 python3[5582]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:43 np0005603500 python3[5606]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:44 np0005603500 python3[5630]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:44 np0005603500 python3[5654]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:44 np0005603500 python3[5678]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:44 np0005603500 python3[5702]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:45 np0005603500 python3[5726]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:45 np0005603500 python3[5750]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:45 np0005603500 python3[5774]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:45 np0005603500 python3[5798]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:46 np0005603500 python3[5822]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:46 np0005603500 python3[5846]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:46 np0005603500 python3[5870]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:46 np0005603500 python3[5894]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:47 np0005603500 python3[5918]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:47 np0005603500 python3[5942]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:47 np0005603500 python3[5966]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:48 np0005603500 python3[5990]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:48 np0005603500 python3[6014]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:48 np0005603500 python3[6038]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:35:51 np0005603500 python3[6064]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 31 00:35:51 np0005603500 systemd[1]: Starting Time & Date Service...
Jan 31 00:35:51 np0005603500 systemd[1]: Started Time & Date Service.
Jan 31 00:35:51 np0005603500 systemd-timedated[6066]: Changed time zone to 'UTC' (UTC).
Jan 31 00:35:52 np0005603500 python3[6095]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:52 np0005603500 python3[6171]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:35:53 np0005603500 python3[6242]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769837752.3858936-153-174991348981118/source _original_basename=tmpaxo1s48p follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:54 np0005603500 python3[6342]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:35:54 np0005603500 python3[6413]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769837753.9298475-183-37898782197921/source _original_basename=tmpfbiwo8qs follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:55 np0005603500 python3[6515]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:35:55 np0005603500 python3[6588]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769837754.9309635-231-242018484675728/source _original_basename=tmpbk9bfhdk follow=False checksum=aad8ab19ba5ef1801e0e8aebf96af2ca109a6077 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:56 np0005603500 python3[6636]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:35:56 np0005603500 python3[6662]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:35:56 np0005603500 python3[6742]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:35:57 np0005603500 python3[6815]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769837756.5410526-273-64613210921540/source _original_basename=tmpf5dzc1hv follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:35:57 np0005603500 python3[6866]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-3c98-1243-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:35:58 np0005603500 python3[6894]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-3c98-1243-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 31 00:35:59 np0005603500 python3[6922]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:36:17 np0005603500 python3[6948]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:36:21 np0005603500 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 00:36:56 np0005603500 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 31 00:36:56 np0005603500 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 31 00:36:56 np0005603500 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 31 00:36:56 np0005603500 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 31 00:36:56 np0005603500 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 31 00:36:56 np0005603500 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 31 00:36:56 np0005603500 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 31 00:36:56 np0005603500 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 31 00:36:56 np0005603500 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 31 00:36:56 np0005603500 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7156] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 31 00:36:56 np0005603500 systemd-udevd[6952]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7311] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7337] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7339] device (eth1): carrier: link connected
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7340] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7344] policy: auto-activating connection 'Wired connection 1' (0ae45ab3-5e67-3c3b-898d-4a6bebc55a91)
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7348] device (eth1): Activation: starting connection 'Wired connection 1' (0ae45ab3-5e67-3c3b-898d-4a6bebc55a91)
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7349] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7351] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7355] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 00:36:56 np0005603500 NetworkManager[857]: <info>  [1769837816.7358] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 31 00:36:57 np0005603500 python3[6978]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-aa1f-a873-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:37:04 np0005603500 python3[7058]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:37:04 np0005603500 python3[7131]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769837824.096281-102-219991911901523/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=60e72a2e31664429fb68167eac599dc66d7ba66d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:37:05 np0005603500 python3[7181]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 00:37:05 np0005603500 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 31 00:37:05 np0005603500 systemd[1]: Stopped Network Manager Wait Online.
Jan 31 00:37:05 np0005603500 systemd[1]: Stopping Network Manager Wait Online...
Jan 31 00:37:05 np0005603500 NetworkManager[857]: <info>  [1769837825.5882] caught SIGTERM, shutting down normally.
Jan 31 00:37:05 np0005603500 systemd[1]: Stopping Network Manager...
Jan 31 00:37:05 np0005603500 NetworkManager[857]: <info>  [1769837825.5892] dhcp4 (eth0): canceled DHCP transaction
Jan 31 00:37:05 np0005603500 NetworkManager[857]: <info>  [1769837825.5892] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 00:37:05 np0005603500 NetworkManager[857]: <info>  [1769837825.5893] dhcp4 (eth0): state changed no lease
Jan 31 00:37:05 np0005603500 NetworkManager[857]: <info>  [1769837825.5897] manager: NetworkManager state is now CONNECTING
Jan 31 00:37:05 np0005603500 NetworkManager[857]: <info>  [1769837825.6036] dhcp4 (eth1): canceled DHCP transaction
Jan 31 00:37:05 np0005603500 NetworkManager[857]: <info>  [1769837825.6037] dhcp4 (eth1): state changed no lease
Jan 31 00:37:05 np0005603500 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 00:37:05 np0005603500 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 00:37:05 np0005603500 NetworkManager[857]: <info>  [1769837825.6328] exiting (success)
Jan 31 00:37:05 np0005603500 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 31 00:37:05 np0005603500 systemd[1]: Stopped Network Manager.
Jan 31 00:37:05 np0005603500 systemd[1]: NetworkManager.service: Consumed 1.133s CPU time, 10.0M memory peak.
Jan 31 00:37:05 np0005603500 systemd[1]: Starting Network Manager...
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.6741] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:3dddb628-1b36-4865-82fe-2cb3f5410e26)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.6742] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.6777] manager[0x55a0ba5b5000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 31 00:37:05 np0005603500 systemd[1]: Starting Hostname Service...
Jan 31 00:37:05 np0005603500 systemd[1]: Started Hostname Service.
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7456] hostname: hostname: using hostnamed
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7459] hostname: static hostname changed from (none) to "np0005603500.novalocal"
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7467] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7473] manager[0x55a0ba5b5000]: rfkill: Wi-Fi hardware radio set enabled
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7473] manager[0x55a0ba5b5000]: rfkill: WWAN hardware radio set enabled
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7513] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7514] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7515] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7516] manager: Networking is enabled by state file
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7519] settings: Loaded settings plugin: keyfile (internal)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7524] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7562] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7574] dhcp: init: Using DHCP client 'internal'
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7579] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7587] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7594] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7605] device (lo): Activation: starting connection 'lo' (5b1c2e8d-86e8-4488-ae63-5b9b7ed60e29)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7613] device (eth0): carrier: link connected
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7619] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7627] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7627] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7636] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7646] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7653] device (eth1): carrier: link connected
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7658] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7666] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (0ae45ab3-5e67-3c3b-898d-4a6bebc55a91) (indicated)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7667] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7674] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7688] device (eth1): Activation: starting connection 'Wired connection 1' (0ae45ab3-5e67-3c3b-898d-4a6bebc55a91)
Jan 31 00:37:05 np0005603500 systemd[1]: Started Network Manager.
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7700] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7709] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7713] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7717] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7721] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7727] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7732] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7736] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7742] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7754] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7760] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7774] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7780] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7807] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7814] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7824] device (lo): Activation: successful, device activated.
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7833] dhcp4 (eth0): state changed new lease, address=38.102.83.9
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.7841] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 31 00:37:05 np0005603500 systemd[1]: Starting Network Manager Wait Online...
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.8787] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.8802] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.8803] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.8806] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.8808] device (eth0): Activation: successful, device activated.
Jan 31 00:37:05 np0005603500 NetworkManager[7198]: <info>  [1769837825.8812] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 31 00:37:06 np0005603500 python3[7266]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-aa1f-a873-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:37:15 np0005603500 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 00:37:35 np0005603500 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6490] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 00:37:50 np0005603500 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 00:37:50 np0005603500 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6762] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6765] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6774] device (eth1): Activation: successful, device activated.
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6784] manager: startup complete
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6787] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <warn>  [1769837870.6799] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6814] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 31 00:37:50 np0005603500 systemd[1]: Finished Network Manager Wait Online.
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6946] dhcp4 (eth1): canceled DHCP transaction
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6947] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6947] dhcp4 (eth1): state changed no lease
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6970] policy: auto-activating connection 'ci-private-network' (8c0b1a26-8ccb-5a0f-94f9-24f522105883)
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6978] device (eth1): Activation: starting connection 'ci-private-network' (8c0b1a26-8ccb-5a0f-94f9-24f522105883)
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6980] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6984] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.6996] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.7010] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.7955] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.7957] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 00:37:50 np0005603500 NetworkManager[7198]: <info>  [1769837870.7960] device (eth1): Activation: successful, device activated.
Jan 31 00:37:58 np0005603500 python3[7371]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:37:58 np0005603500 python3[7444]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769837877.7633069-259-117212677836893/source _original_basename=tmpgunup6ch follow=False checksum=f86ccc9118e366faa0cfb133175a22165dcf43bb backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:38:00 np0005603500 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 00:38:16 np0005603500 systemd[4307]: Starting Mark boot as successful...
Jan 31 00:38:16 np0005603500 systemd[4307]: Finished Mark boot as successful.
Jan 31 00:38:58 np0005603500 systemd-logind[821]: Session 1 logged out. Waiting for processes to exit.
Jan 31 00:41:16 np0005603500 systemd[4307]: Created slice User Background Tasks Slice.
Jan 31 00:41:16 np0005603500 systemd[4307]: Starting Cleanup of User's Temporary Files and Directories...
Jan 31 00:41:16 np0005603500 systemd[4307]: Finished Cleanup of User's Temporary Files and Directories.
Jan 31 00:45:47 np0005603500 systemd-logind[821]: New session 3 of user zuul.
Jan 31 00:45:47 np0005603500 systemd[1]: Started Session 3 of User zuul.
Jan 31 00:45:47 np0005603500 python3[7503]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-90cf-44d6-00000000216f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:45:48 np0005603500 python3[7531]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:45:48 np0005603500 python3[7557]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:45:48 np0005603500 python3[7584]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:45:49 np0005603500 python3[7610]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:45:49 np0005603500 python3[7636]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:45:50 np0005603500 python3[7714]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:45:50 np0005603500 python3[7787]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769838349.7042632-499-88482526900799/source _original_basename=tmphicueisf follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:45:51 np0005603500 python3[7838]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 00:45:51 np0005603500 systemd[1]: Reloading.
Jan 31 00:45:51 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 00:45:52 np0005603500 python3[7894]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 31 00:45:53 np0005603500 python3[7920]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:45:53 np0005603500 python3[7948]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:45:53 np0005603500 python3[7976]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:45:54 np0005603500 python3[8004]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:45:54 np0005603500 python3[8031]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-90cf-44d6-000000002176-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:45:55 np0005603500 python3[8061]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 31 00:45:57 np0005603500 systemd-logind[821]: Session 3 logged out. Waiting for processes to exit.
Jan 31 00:45:57 np0005603500 systemd[1]: session-3.scope: Deactivated successfully.
Jan 31 00:45:57 np0005603500 systemd[1]: session-3.scope: Consumed 3.822s CPU time.
Jan 31 00:45:57 np0005603500 systemd-logind[821]: Removed session 3.
Jan 31 00:45:59 np0005603500 systemd-logind[821]: New session 4 of user zuul.
Jan 31 00:45:59 np0005603500 systemd[1]: Started Session 4 of User zuul.
Jan 31 00:45:59 np0005603500 python3[8096]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 31 00:46:10 np0005603500 setsebool[8139]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 31 00:46:10 np0005603500 setsebool[8139]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 31 00:46:20 np0005603500 kernel: SELinux:  Converting 385 SID table entries...
Jan 31 00:46:20 np0005603500 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 00:46:20 np0005603500 kernel: SELinux:  policy capability open_perms=1
Jan 31 00:46:20 np0005603500 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 00:46:20 np0005603500 kernel: SELinux:  policy capability always_check_network=0
Jan 31 00:46:20 np0005603500 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 00:46:20 np0005603500 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 00:46:20 np0005603500 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 00:46:30 np0005603500 kernel: SELinux:  Converting 388 SID table entries...
Jan 31 00:46:30 np0005603500 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 00:46:30 np0005603500 kernel: SELinux:  policy capability open_perms=1
Jan 31 00:46:30 np0005603500 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 00:46:30 np0005603500 kernel: SELinux:  policy capability always_check_network=0
Jan 31 00:46:30 np0005603500 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 00:46:30 np0005603500 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 00:46:30 np0005603500 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 00:46:50 np0005603500 dbus-broker-launch[801]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 31 00:46:50 np0005603500 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 00:46:50 np0005603500 systemd[1]: Starting man-db-cache-update.service...
Jan 31 00:46:50 np0005603500 systemd[1]: Reloading.
Jan 31 00:46:50 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 00:46:50 np0005603500 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 00:46:54 np0005603500 python3[10406]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-abb8-e0ce-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:46:55 np0005603500 kernel: evm: overlay not supported
Jan 31 00:46:55 np0005603500 systemd[4307]: Starting D-Bus User Message Bus...
Jan 31 00:46:55 np0005603500 dbus-broker-launch[11451]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 31 00:46:55 np0005603500 dbus-broker-launch[11451]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 31 00:46:55 np0005603500 systemd[4307]: Started D-Bus User Message Bus.
Jan 31 00:46:55 np0005603500 dbus-broker-lau[11451]: Ready
Jan 31 00:46:55 np0005603500 systemd[4307]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 31 00:46:55 np0005603500 systemd[4307]: Created slice Slice /user.
Jan 31 00:46:55 np0005603500 systemd[4307]: podman-11141.scope: unit configures an IP firewall, but not running as root.
Jan 31 00:46:55 np0005603500 systemd[4307]: (This warning is only shown for the first unit using IP firewalling.)
Jan 31 00:46:55 np0005603500 systemd[4307]: Started podman-11141.scope.
Jan 31 00:46:56 np0005603500 systemd[4307]: Started podman-pause-d8ee46ad.scope.
Jan 31 00:46:56 np0005603500 systemd[1]: session-4.scope: Deactivated successfully.
Jan 31 00:46:56 np0005603500 systemd[1]: session-4.scope: Consumed 41.196s CPU time.
Jan 31 00:46:56 np0005603500 systemd-logind[821]: Session 4 logged out. Waiting for processes to exit.
Jan 31 00:46:56 np0005603500 systemd-logind[821]: Removed session 4.
Jan 31 00:46:56 np0005603500 irqbalance[816]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 31 00:46:56 np0005603500 irqbalance[816]: IRQ 27 affinity is now unmanaged
Jan 31 00:47:15 np0005603500 systemd-logind[821]: New session 5 of user zuul.
Jan 31 00:47:15 np0005603500 systemd[1]: Started Session 5 of User zuul.
Jan 31 00:47:16 np0005603500 python3[19936]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMukReDuFaK1gQ4eRyJAcYJXybcr9Gp6nLsQTX8wveAexEdn2QkZpX8zoO9voG9ufhox+oxucDeKTnGopbcmLAM= zuul@np0005603499.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:47:16 np0005603500 python3[20173]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMukReDuFaK1gQ4eRyJAcYJXybcr9Gp6nLsQTX8wveAexEdn2QkZpX8zoO9voG9ufhox+oxucDeKTnGopbcmLAM= zuul@np0005603499.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:47:17 np0005603500 python3[20596]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005603500.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 31 00:47:18 np0005603500 python3[20949]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMukReDuFaK1gQ4eRyJAcYJXybcr9Gp6nLsQTX8wveAexEdn2QkZpX8zoO9voG9ufhox+oxucDeKTnGopbcmLAM= zuul@np0005603499.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 00:47:18 np0005603500 python3[21221]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:47:19 np0005603500 python3[21549]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769838438.4707446-118-15733832601982/source _original_basename=tmpndmv0zvn follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:47:19 np0005603500 python3[21900]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 31 00:47:20 np0005603500 systemd[1]: Starting Hostname Service...
Jan 31 00:47:20 np0005603500 systemd[1]: Started Hostname Service.
Jan 31 00:47:20 np0005603500 systemd-hostnamed[22032]: Changed pretty hostname to 'compute-0'
Jan 31 00:47:20 np0005603500 systemd-hostnamed[22032]: Hostname set to <compute-0> (static)
Jan 31 00:47:20 np0005603500 NetworkManager[7198]: <info>  [1769838440.1319] hostname: static hostname changed from "np0005603500.novalocal" to "compute-0"
Jan 31 00:47:20 np0005603500 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 00:47:20 np0005603500 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 00:47:20 np0005603500 systemd[1]: session-5.scope: Deactivated successfully.
Jan 31 00:47:20 np0005603500 systemd[1]: session-5.scope: Consumed 2.078s CPU time.
Jan 31 00:47:20 np0005603500 systemd-logind[821]: Session 5 logged out. Waiting for processes to exit.
Jan 31 00:47:20 np0005603500 systemd-logind[821]: Removed session 5.
Jan 31 00:47:30 np0005603500 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 00:47:38 np0005603500 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 00:47:38 np0005603500 systemd[1]: Finished man-db-cache-update.service.
Jan 31 00:47:38 np0005603500 systemd[1]: man-db-cache-update.service: Consumed 44.813s CPU time.
Jan 31 00:47:38 np0005603500 systemd[1]: run-rda7c450cb44d4b41bf23ffbc22b119d8.service: Deactivated successfully.
Jan 31 00:47:50 np0005603500 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 00:50:16 np0005603500 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 31 00:50:16 np0005603500 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 31 00:50:16 np0005603500 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 31 00:50:16 np0005603500 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 31 00:51:02 np0005603500 systemd-logind[821]: New session 6 of user zuul.
Jan 31 00:51:02 np0005603500 systemd[1]: Started Session 6 of User zuul.
Jan 31 00:51:03 np0005603500 python3[30038]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 00:51:04 np0005603500 python3[30154]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:51:05 np0005603500 python3[30227]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769838664.4461892-33594-11352227922323/source mode=0755 _original_basename=delorean.repo follow=False checksum=a8b70374c78835be4469ae6662980594186ef9a7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:51:05 np0005603500 python3[30253]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:51:05 np0005603500 python3[30326]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769838664.4461892-33594-11352227922323/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=2c5ad31b3cd5c5b96a9995d83e342833f9bd7020 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:51:06 np0005603500 python3[30352]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:51:06 np0005603500 python3[30425]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769838664.4461892-33594-11352227922323/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:51:06 np0005603500 python3[30451]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:51:07 np0005603500 python3[30524]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769838664.4461892-33594-11352227922323/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:51:07 np0005603500 python3[30550]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:51:07 np0005603500 python3[30623]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769838664.4461892-33594-11352227922323/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:51:07 np0005603500 python3[30649]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:51:08 np0005603500 python3[30722]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769838664.4461892-33594-11352227922323/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:51:08 np0005603500 python3[30748]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 00:51:08 np0005603500 python3[30821]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769838664.4461892-33594-11352227922323/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=0ed36b5298fddc97283956522b2e059238671e05 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 00:51:16 np0005603500 python3[30879]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 00:56:16 np0005603500 systemd[1]: session-6.scope: Deactivated successfully.
Jan 31 00:56:16 np0005603500 systemd[1]: session-6.scope: Consumed 4.533s CPU time.
Jan 31 00:56:16 np0005603500 systemd-logind[821]: Session 6 logged out. Waiting for processes to exit.
Jan 31 00:56:16 np0005603500 systemd-logind[821]: Removed session 6.
Jan 31 01:06:28 np0005603500 systemd-logind[821]: New session 7 of user zuul.
Jan 31 01:06:28 np0005603500 systemd[1]: Started Session 7 of User zuul.
Jan 31 01:06:29 np0005603500 python3.9[31054]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:06:30 np0005603500 python3.9[31235]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:06:41 np0005603500 systemd[1]: session-7.scope: Deactivated successfully.
Jan 31 01:06:41 np0005603500 systemd[1]: session-7.scope: Consumed 7.725s CPU time.
Jan 31 01:06:41 np0005603500 systemd-logind[821]: Session 7 logged out. Waiting for processes to exit.
Jan 31 01:06:41 np0005603500 systemd-logind[821]: Removed session 7.
Jan 31 01:06:47 np0005603500 systemd-logind[821]: New session 8 of user zuul.
Jan 31 01:06:47 np0005603500 systemd[1]: Started Session 8 of User zuul.
Jan 31 01:06:48 np0005603500 python3.9[31446]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:06:48 np0005603500 systemd[1]: session-8.scope: Deactivated successfully.
Jan 31 01:06:48 np0005603500 systemd-logind[821]: Session 8 logged out. Waiting for processes to exit.
Jan 31 01:06:48 np0005603500 systemd-logind[821]: Removed session 8.
Jan 31 01:07:06 np0005603500 systemd-logind[821]: New session 9 of user zuul.
Jan 31 01:07:06 np0005603500 systemd[1]: Started Session 9 of User zuul.
Jan 31 01:07:06 np0005603500 python3.9[31627]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 31 01:07:07 np0005603500 python3.9[31801]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:07:08 np0005603500 python3.9[31953]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:07:09 np0005603500 python3.9[32106]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:07:10 np0005603500 python3.9[32258]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:07:10 np0005603500 python3.9[32410]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:07:11 np0005603500 python3.9[32533]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769839630.5468585-68-210107026723635/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:07:12 np0005603500 python3.9[32685]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:07:13 np0005603500 python3.9[32841]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:07:13 np0005603500 python3.9[32993]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:07:14 np0005603500 python3.9[33143]: ansible-ansible.builtin.service_facts Invoked
Jan 31 01:07:16 np0005603500 python3.9[33396]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:07:17 np0005603500 python3.9[33546]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:07:17 np0005603500 irqbalance[816]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 31 01:07:17 np0005603500 irqbalance[816]: IRQ 26 affinity is now unmanaged
Jan 31 01:07:18 np0005603500 python3.9[33700]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:07:19 np0005603500 python3.9[33858]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:07:20 np0005603500 python3.9[33942]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:08:12 np0005603500 systemd[1]: Reloading.
Jan 31 01:08:12 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:08:12 np0005603500 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 31 01:08:14 np0005603500 systemd[1]: Reloading.
Jan 31 01:08:14 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:08:14 np0005603500 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 31 01:08:14 np0005603500 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 31 01:08:14 np0005603500 systemd[1]: Reloading.
Jan 31 01:08:14 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:08:14 np0005603500 systemd[1]: Starting dnf makecache...
Jan 31 01:08:14 np0005603500 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 31 01:08:14 np0005603500 dnf[34234]: Failed determining last makecache time.
Jan 31 01:08:14 np0005603500 dnf[34234]: delorean-openstack-barbican-42b4c41831408a8e323 175 kB/s | 3.0 kB     00:00
Jan 31 01:08:14 np0005603500 dnf[34234]: delorean-python-glean-642fffe0203a8ffcc2443db52 204 kB/s | 3.0 kB     00:00
Jan 31 01:08:14 np0005603500 dnf[34234]: delorean-openstack-cinder-1c00d6490d88e436f26ef 201 kB/s | 3.0 kB     00:00
Jan 31 01:08:14 np0005603500 dnf[34234]: delorean-python-stevedore-c4acc5639fd2329372142 201 kB/s | 3.0 kB     00:00
Jan 31 01:08:14 np0005603500 dnf[34234]: delorean-python-cloudkitty-tests-tempest-783703 193 kB/s | 3.0 kB     00:00
Jan 31 01:08:14 np0005603500 dnf[34234]: delorean-diskimage-builder-61b717cc45660834fe9a 195 kB/s | 3.0 kB     00:00
Jan 31 01:08:14 np0005603500 dnf[34234]: delorean-openstack-nova-eaa65f0b85123a4ee343246 203 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-python-designate-tests-tempest-347fdbc 156 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-openstack-glance-1fd12c29b339f30fe823e 189 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 180 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-openstack-manila-d783d10e75495b73866db 174 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-openstack-neutron-95cadbd379667c8520c8 197 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-openstack-octavia-5975097dd4b021385178 187 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-openstack-watcher-c014f81a8647287f6dcc 182 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-python-tcib-78032d201b02cee27e8e644c61 183 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 182 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-openstack-swift-dc98a8463506ac520c469a 179 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-python-tempestconf-8515371b7cceebd4282 189 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: delorean-openstack-heat-ui-013accbfd179753bc3f0 185 kB/s | 3.0 kB     00:00
Jan 31 01:08:15 np0005603500 dbus-broker-launch[789]: Noticed file-system modification, trigger reload.
Jan 31 01:08:15 np0005603500 dbus-broker-launch[789]: Noticed file-system modification, trigger reload.
Jan 31 01:08:15 np0005603500 dbus-broker-launch[789]: Noticed file-system modification, trigger reload.
Jan 31 01:08:15 np0005603500 dnf[34234]: CentOS Stream 9 - BaseOS                         56 kB/s | 6.1 kB     00:00
Jan 31 01:08:15 np0005603500 dnf[34234]: CentOS Stream 9 - AppStream                      28 kB/s | 6.5 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: CentOS Stream 9 - CRB                           4.4 kB/s | 6.0 kB     00:01
Jan 31 01:08:17 np0005603500 dnf[34234]: CentOS Stream 9 - Extras packages                62 kB/s | 7.3 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: dlrn-antelope-testing                           179 kB/s | 3.0 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: dlrn-antelope-build-deps                        156 kB/s | 3.0 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: centos9-rabbitmq                                108 kB/s | 3.0 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: centos9-storage                                 117 kB/s | 3.0 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: centos9-opstools                                136 kB/s | 3.0 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: NFV SIG OpenvSwitch                             135 kB/s | 3.0 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: repo-setup-centos-appstream                     165 kB/s | 4.4 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: repo-setup-centos-baseos                        115 kB/s | 3.9 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: repo-setup-centos-highavailability               16 kB/s | 3.9 kB     00:00
Jan 31 01:08:17 np0005603500 dnf[34234]: repo-setup-centos-powertools                     65 kB/s | 4.3 kB     00:00
Jan 31 01:08:18 np0005603500 dnf[34234]: Extra Packages for Enterprise Linux 9 - x86_64  103 kB/s |  31 kB     00:00
Jan 31 01:08:18 np0005603500 dnf[34234]: Metadata cache created.
Jan 31 01:08:19 np0005603500 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 31 01:08:19 np0005603500 systemd[1]: Finished dnf makecache.
Jan 31 01:08:19 np0005603500 systemd[1]: dnf-makecache.service: Consumed 1.712s CPU time.
Jan 31 01:09:50 np0005603500 kernel: SELinux:  Converting 2726 SID table entries...
Jan 31 01:09:50 np0005603500 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:09:50 np0005603500 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:09:50 np0005603500 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:09:50 np0005603500 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:09:50 np0005603500 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:09:50 np0005603500 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:09:50 np0005603500 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:09:50 np0005603500 dbus-broker-launch[801]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 31 01:09:50 np0005603500 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:09:50 np0005603500 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:09:50 np0005603500 systemd[1]: Reloading.
Jan 31 01:09:51 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:09:51 np0005603500 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:09:53 np0005603500 python3.9[35498]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:09:54 np0005603500 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:09:54 np0005603500 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:09:54 np0005603500 systemd[1]: man-db-cache-update.service: Consumed 1.001s CPU time.
Jan 31 01:09:54 np0005603500 systemd[1]: run-r0e0a4a23b8a74d179ee0161d583bbabf.service: Deactivated successfully.
Jan 31 01:09:56 np0005603500 python3.9[35780]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 31 01:09:56 np0005603500 python3.9[35932]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 31 01:10:00 np0005603500 python3.9[36086]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:10:01 np0005603500 python3.9[36238]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 31 01:10:02 np0005603500 python3.9[36390]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:10:02 np0005603500 python3.9[36542]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:10:06 np0005603500 python3.9[36665]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769839802.3905666-231-129168621204481/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21080916ed436367f9d55ee0e8f00e696634482f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:10:07 np0005603500 python3.9[36817]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:10:07 np0005603500 python3.9[36969]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:10:08 np0005603500 python3.9[37122]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:10:09 np0005603500 python3.9[37274]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 31 01:10:09 np0005603500 python3.9[37427]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 01:10:09 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:10:10 np0005603500 python3.9[37586]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 01:10:11 np0005603500 python3.9[37746]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 31 01:10:12 np0005603500 python3.9[37899]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 01:10:13 np0005603500 python3.9[38057]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 31 01:10:13 np0005603500 python3.9[38209]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:10:18 np0005603500 python3.9[38363]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:10:18 np0005603500 python3.9[38515]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:10:19 np0005603500 python3.9[38638]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769839818.2423723-350-102648678547102/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:10:20 np0005603500 python3.9[38790]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:10:20 np0005603500 systemd[1]: Starting Load Kernel Modules...
Jan 31 01:10:20 np0005603500 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 31 01:10:20 np0005603500 kernel: Bridge firewalling registered
Jan 31 01:10:20 np0005603500 systemd-modules-load[38794]: Inserted module 'br_netfilter'
Jan 31 01:10:20 np0005603500 systemd[1]: Finished Load Kernel Modules.
Jan 31 01:10:21 np0005603500 python3.9[38950]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:10:21 np0005603500 python3.9[39073]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769839820.6899798-373-202238977654882/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:10:22 np0005603500 python3.9[39225]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:10:26 np0005603500 dbus-broker-launch[789]: Noticed file-system modification, trigger reload.
Jan 31 01:10:26 np0005603500 dbus-broker-launch[789]: Noticed file-system modification, trigger reload.
Jan 31 01:10:27 np0005603500 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:10:27 np0005603500 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:10:27 np0005603500 systemd[1]: Reloading.
Jan 31 01:10:27 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:10:27 np0005603500 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:10:29 np0005603500 python3.9[41147]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:10:30 np0005603500 python3.9[42416]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 31 01:10:30 np0005603500 python3.9[43289]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:10:31 np0005603500 python3.9[43441]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:10:31 np0005603500 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 31 01:10:32 np0005603500 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:10:32 np0005603500 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:10:32 np0005603500 systemd[1]: man-db-cache-update.service: Consumed 3.452s CPU time.
Jan 31 01:10:32 np0005603500 systemd[1]: run-rf4a14bed28a642dca508b26243e26916.service: Deactivated successfully.
Jan 31 01:10:32 np0005603500 systemd[1]: Starting Authorization Manager...
Jan 31 01:10:32 np0005603500 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 31 01:10:32 np0005603500 polkitd[43659]: Started polkitd version 0.117
Jan 31 01:10:32 np0005603500 systemd[1]: Started Authorization Manager.
Jan 31 01:10:33 np0005603500 python3.9[43829]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:10:33 np0005603500 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 31 01:10:33 np0005603500 systemd[1]: tuned.service: Deactivated successfully.
Jan 31 01:10:33 np0005603500 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 31 01:10:33 np0005603500 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 31 01:10:33 np0005603500 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 31 01:10:34 np0005603500 python3.9[43990]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 31 01:10:36 np0005603500 python3.9[44142]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:10:37 np0005603500 systemd[1]: Reloading.
Jan 31 01:10:37 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:10:37 np0005603500 python3.9[44332]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:10:37 np0005603500 systemd[1]: Reloading.
Jan 31 01:10:37 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:10:38 np0005603500 python3.9[44520]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:10:39 np0005603500 python3.9[44673]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:10:39 np0005603500 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 31 01:10:39 np0005603500 python3.9[44826]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:10:41 np0005603500 python3.9[44988]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:10:42 np0005603500 python3.9[45141]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:10:42 np0005603500 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 31 01:10:42 np0005603500 systemd[1]: Stopped Apply Kernel Variables.
Jan 31 01:10:42 np0005603500 systemd[1]: Stopping Apply Kernel Variables...
Jan 31 01:10:42 np0005603500 systemd[1]: Starting Apply Kernel Variables...
Jan 31 01:10:42 np0005603500 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 31 01:10:42 np0005603500 systemd[1]: Finished Apply Kernel Variables.
Jan 31 01:10:43 np0005603500 systemd[1]: session-9.scope: Deactivated successfully.
Jan 31 01:10:43 np0005603500 systemd[1]: session-9.scope: Consumed 2min 4.584s CPU time.
Jan 31 01:10:43 np0005603500 systemd-logind[821]: Session 9 logged out. Waiting for processes to exit.
Jan 31 01:10:43 np0005603500 systemd-logind[821]: Removed session 9.
Jan 31 01:10:52 np0005603500 systemd-logind[821]: New session 10 of user zuul.
Jan 31 01:10:52 np0005603500 systemd[1]: Started Session 10 of User zuul.
Jan 31 01:10:53 np0005603500 python3.9[45324]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:10:54 np0005603500 python3.9[45478]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:10:55 np0005603500 python3.9[45634]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:10:56 np0005603500 python3.9[45785]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:10:56 np0005603500 python3.9[45941]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:10:57 np0005603500 python3.9[46025]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:10:59 np0005603500 python3.9[46178]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:11:00 np0005603500 python3.9[46349]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:11:01 np0005603500 python3.9[46501]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:11:02 np0005603500 podman[46502]: 2026-01-31 06:11:02.052314844 +0000 UTC m=+0.621711752 system refresh
Jan 31 01:11:02 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:11:02 np0005603500 python3.9[46666]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:11:03 np0005603500 python3.9[46789]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769839862.2235339-104-266563612108768/.source.json follow=False _original_basename=podman_network_config.j2 checksum=de257c1fdb42a7d301723092127100abb05655ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:11:04 np0005603500 python3.9[46941]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:11:04 np0005603500 python3.9[47064]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769839863.5525053-119-259722423012588/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:11:05 np0005603500 python3.9[47216]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:11:06 np0005603500 python3.9[47368]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:11:06 np0005603500 python3.9[47520]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:11:07 np0005603500 python3.9[47672]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:11:08 np0005603500 python3.9[47822]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:11:08 np0005603500 python3.9[47976]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:11 np0005603500 python3.9[48129]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:14 np0005603500 python3.9[48290]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:16 np0005603500 python3.9[48443]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:18 np0005603500 python3.9[48596]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:21 np0005603500 python3.9[48752]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:25 np0005603500 python3.9[48921]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:27 np0005603500 python3.9[49074]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:45 np0005603500 python3.9[49410]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:47 np0005603500 python3.9[49566]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:11:49 np0005603500 python3.9[49723]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:11:49 np0005603500 python3.9[49898]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:11:50 np0005603500 python3.9[50021]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769839909.5202703-277-266185347025043/.source.json _original_basename=.wz49pymq follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:11:51 np0005603500 python3.9[50173]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 31 01:11:51 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:11:58 np0005603500 systemd[1]: var-lib-containers-storage-overlay-compat1731318188-lower\x2dmapped.mount: Deactivated successfully.
Jan 31 01:12:01 np0005603500 podman[50184]: 2026-01-31 06:12:01.247357335 +0000 UTC m=+9.847957818 image pull 41feda93bd9d79a05e844f418942ad169ecdce2acd7eff3ec0131c566040529d quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:12:01 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:01 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:01 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:02 np0005603500 python3.9[50480]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 31 01:12:16 np0005603500 podman[50492]: 2026-01-31 06:12:16.688177744 +0000 UTC m=+14.499057226 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:12:16 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:16 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:16 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:17 np0005603500 python3.9[50812]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 31 01:12:36 np0005603500 podman[50825]: 2026-01-31 06:12:36.985940061 +0000 UTC m=+19.312202681 image pull 9464506e605c3736f039205df9460679aa5b9e23fa6c2ca013e2f0c1365f627e quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:12:36 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:37 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:37 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:37 np0005603500 python3.9[51080]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 31 01:12:43 np0005603500 podman[51092]: 2026-01-31 06:12:43.131574619 +0000 UTC m=+5.175736130 image pull ccafe36535c9326f773051911bf7e736f46f05eea29aaa728ad791f05a9c5d70 quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:12:43 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:43 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:43 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:43 np0005603500 python3.9[51346]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 31 01:12:45 np0005603500 podman[51358]: 2026-01-31 06:12:45.048279889 +0000 UTC m=+1.167363286 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Jan 31 01:12:45 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:45 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:45 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:45 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:45 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:12:45 np0005603500 systemd[1]: session-10.scope: Deactivated successfully.
Jan 31 01:12:45 np0005603500 systemd[1]: session-10.scope: Consumed 1min 59.769s CPU time.
Jan 31 01:12:45 np0005603500 systemd-logind[821]: Session 10 logged out. Waiting for processes to exit.
Jan 31 01:12:45 np0005603500 systemd-logind[821]: Removed session 10.
Jan 31 01:12:52 np0005603500 systemd-logind[821]: New session 11 of user zuul.
Jan 31 01:12:52 np0005603500 systemd[1]: Started Session 11 of User zuul.
Jan 31 01:12:53 np0005603500 python3.9[51656]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:12:54 np0005603500 python3.9[51812]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 31 01:12:55 np0005603500 python3.9[51965]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 01:12:56 np0005603500 python3.9[52123]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 01:12:57 np0005603500 python3.9[52283]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:12:58 np0005603500 python3.9[52367]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:13:05 np0005603500 python3.9[52528]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:13:17 np0005603500 kernel: SELinux:  Converting 2739 SID table entries...
Jan 31 01:13:17 np0005603500 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:13:17 np0005603500 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:13:17 np0005603500 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:13:17 np0005603500 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:13:17 np0005603500 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:13:17 np0005603500 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:13:17 np0005603500 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:13:17 np0005603500 dbus-broker-launch[801]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 31 01:13:17 np0005603500 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 31 01:13:18 np0005603500 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:13:18 np0005603500 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:13:18 np0005603500 systemd[1]: Reloading.
Jan 31 01:13:18 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:13:18 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:13:18 np0005603500 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:13:19 np0005603500 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:13:19 np0005603500 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:13:19 np0005603500 systemd[1]: run-ra5242218d6a046b1b3579173e2bc0354.service: Deactivated successfully.
Jan 31 01:13:20 np0005603500 python3.9[53627]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 01:13:20 np0005603500 systemd[1]: Reloading.
Jan 31 01:13:20 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:13:20 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:13:20 np0005603500 systemd[1]: Starting Open vSwitch Database Unit...
Jan 31 01:13:20 np0005603500 chown[53669]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 31 01:13:20 np0005603500 ovs-ctl[53674]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 31 01:13:21 np0005603500 ovs-ctl[53674]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 31 01:13:21 np0005603500 ovs-ctl[53674]: Starting ovsdb-server [  OK  ]
Jan 31 01:13:21 np0005603500 ovs-vsctl[53723]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 31 01:13:21 np0005603500 ovs-vsctl[53743]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"fe203bcd-9b71-4c38-9736-f063b4ce4137\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 31 01:13:21 np0005603500 ovs-ctl[53674]: Configuring Open vSwitch system IDs [  OK  ]
Jan 31 01:13:21 np0005603500 ovs-vsctl[53749]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 31 01:13:21 np0005603500 ovs-ctl[53674]: Enabling remote OVSDB managers [  OK  ]
Jan 31 01:13:21 np0005603500 systemd[1]: Started Open vSwitch Database Unit.
Jan 31 01:13:21 np0005603500 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 31 01:13:21 np0005603500 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 31 01:13:21 np0005603500 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 31 01:13:21 np0005603500 kernel: openvswitch: Open vSwitch switching datapath
Jan 31 01:13:21 np0005603500 ovs-ctl[53793]: Inserting openvswitch module [  OK  ]
Jan 31 01:13:21 np0005603500 ovs-ctl[53762]: Starting ovs-vswitchd [  OK  ]
Jan 31 01:13:21 np0005603500 ovs-vsctl[53811]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 31 01:13:21 np0005603500 ovs-ctl[53762]: Enabling remote OVSDB managers [  OK  ]
Jan 31 01:13:21 np0005603500 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 31 01:13:21 np0005603500 systemd[1]: Starting Open vSwitch...
Jan 31 01:13:21 np0005603500 systemd[1]: Finished Open vSwitch.
Jan 31 01:13:22 np0005603500 python3.9[53962]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:13:23 np0005603500 python3.9[54114]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 31 01:13:24 np0005603500 kernel: SELinux:  Converting 2753 SID table entries...
Jan 31 01:13:24 np0005603500 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:13:24 np0005603500 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:13:24 np0005603500 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:13:24 np0005603500 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:13:24 np0005603500 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:13:24 np0005603500 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:13:24 np0005603500 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:13:24 np0005603500 python3.9[54269]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:13:25 np0005603500 dbus-broker-launch[801]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 31 01:13:25 np0005603500 python3.9[54427]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:13:27 np0005603500 python3.9[54580]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:13:29 np0005603500 python3.9[54867]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 31 01:13:29 np0005603500 python3.9[55017]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:13:30 np0005603500 python3.9[55171]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:13:34 np0005603500 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:13:34 np0005603500 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:13:34 np0005603500 systemd[1]: Reloading.
Jan 31 01:13:34 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:13:34 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:13:34 np0005603500 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:13:35 np0005603500 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:13:35 np0005603500 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:13:35 np0005603500 systemd[1]: run-r3f3bb7953d6c4181a8d3110e582c1989.service: Deactivated successfully.
Jan 31 01:13:35 np0005603500 python3.9[55488]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:13:35 np0005603500 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 31 01:13:35 np0005603500 systemd[1]: Stopped Network Manager Wait Online.
Jan 31 01:13:35 np0005603500 systemd[1]: Stopping Network Manager Wait Online...
Jan 31 01:13:35 np0005603500 systemd[1]: Stopping Network Manager...
Jan 31 01:13:35 np0005603500 NetworkManager[7198]: <info>  [1769840015.6071] caught SIGTERM, shutting down normally.
Jan 31 01:13:35 np0005603500 NetworkManager[7198]: <info>  [1769840015.6081] dhcp4 (eth0): canceled DHCP transaction
Jan 31 01:13:35 np0005603500 NetworkManager[7198]: <info>  [1769840015.6081] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:13:35 np0005603500 NetworkManager[7198]: <info>  [1769840015.6081] dhcp4 (eth0): state changed no lease
Jan 31 01:13:35 np0005603500 NetworkManager[7198]: <info>  [1769840015.6083] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 01:13:35 np0005603500 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:13:35 np0005603500 NetworkManager[7198]: <info>  [1769840015.6317] exiting (success)
Jan 31 01:13:35 np0005603500 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:13:35 np0005603500 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 31 01:13:35 np0005603500 systemd[1]: Stopped Network Manager.
Jan 31 01:13:35 np0005603500 systemd[1]: NetworkManager.service: Consumed 18.485s CPU time, 4.3M memory peak, read 0B from disk, written 23.0K to disk.
Jan 31 01:13:35 np0005603500 systemd[1]: Starting Network Manager...
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.6981] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:3dddb628-1b36-4865-82fe-2cb3f5410e26)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.6983] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7053] manager[0x55f1de560000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 31 01:13:35 np0005603500 systemd[1]: Starting Hostname Service...
Jan 31 01:13:35 np0005603500 systemd[1]: Started Hostname Service.
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7706] hostname: hostname: using hostnamed
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7707] hostname: static hostname changed from (none) to "compute-0"
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7711] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7715] manager[0x55f1de560000]: rfkill: Wi-Fi hardware radio set enabled
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7715] manager[0x55f1de560000]: rfkill: WWAN hardware radio set enabled
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7731] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7739] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7739] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7740] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7740] manager: Networking is enabled by state file
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7742] settings: Loaded settings plugin: keyfile (internal)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7745] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7766] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7772] dhcp: init: Using DHCP client 'internal'
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7774] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7778] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7781] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7787] device (lo): Activation: starting connection 'lo' (5b1c2e8d-86e8-4488-ae63-5b9b7ed60e29)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7791] device (eth0): carrier: link connected
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7794] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7797] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7797] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7801] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7807] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7811] device (eth1): carrier: link connected
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7814] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7818] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (8c0b1a26-8ccb-5a0f-94f9-24f522105883) (indicated)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7818] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7822] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7827] device (eth1): Activation: starting connection 'ci-private-network' (8c0b1a26-8ccb-5a0f-94f9-24f522105883)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7833] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 31 01:13:35 np0005603500 systemd[1]: Started Network Manager.
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7843] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7847] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7848] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7850] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7853] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7856] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7858] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7861] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7866] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7868] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7874] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7884] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7891] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7894] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7898] device (lo): Activation: successful, device activated.
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7904] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7907] dhcp4 (eth0): state changed new lease, address=38.102.83.9
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7910] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7913] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7915] device (eth1): Activation: successful, device activated.
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7926] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 31 01:13:35 np0005603500 systemd[1]: Starting Network Manager Wait Online...
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.7980] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.8004] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.8007] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.8010] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.8012] device (eth0): Activation: successful, device activated.
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.8015] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 31 01:13:35 np0005603500 NetworkManager[55506]: <info>  [1769840015.8017] manager: startup complete
Jan 31 01:13:35 np0005603500 systemd[1]: Finished Network Manager Wait Online.
Jan 31 01:13:36 np0005603500 python3.9[55715]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:13:42 np0005603500 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:13:42 np0005603500 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:13:42 np0005603500 systemd[1]: Reloading.
Jan 31 01:13:42 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:13:42 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:13:42 np0005603500 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:13:44 np0005603500 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:13:44 np0005603500 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:13:44 np0005603500 systemd[1]: run-r1e454f317c234330918811ee0918cbc9.service: Deactivated successfully.
Jan 31 01:13:44 np0005603500 python3.9[56177]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:13:45 np0005603500 python3.9[56329]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:13:45 np0005603500 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:13:46 np0005603500 python3.9[56483]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:13:46 np0005603500 python3.9[56635]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:13:47 np0005603500 python3.9[56787]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:13:48 np0005603500 python3.9[56939]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:13:48 np0005603500 python3.9[57091]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:13:49 np0005603500 python3.9[57214]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840028.250628-224-139553735152350/.source _original_basename=.wpu0xr9q follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:13:50 np0005603500 python3.9[57366]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:13:50 np0005603500 python3.9[57518]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 31 01:13:51 np0005603500 python3.9[57670]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:13:53 np0005603500 python3.9[58097]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 31 01:13:54 np0005603500 ansible-async_wrapper.py[58272]: Invoked with j698970713324 300 /home/zuul/.ansible/tmp/ansible-tmp-1769840033.5612564-290-181461002184883/AnsiballZ_edpm_os_net_config.py _
Jan 31 01:13:54 np0005603500 ansible-async_wrapper.py[58275]: Starting module and watcher
Jan 31 01:13:54 np0005603500 ansible-async_wrapper.py[58275]: Start watching 58276 (300)
Jan 31 01:13:54 np0005603500 ansible-async_wrapper.py[58276]: Start module (58276)
Jan 31 01:13:54 np0005603500 ansible-async_wrapper.py[58272]: Return async_wrapper task started.
Jan 31 01:13:54 np0005603500 python3.9[58277]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 31 01:13:55 np0005603500 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 31 01:13:55 np0005603500 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 31 01:13:55 np0005603500 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 31 01:13:55 np0005603500 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 31 01:13:55 np0005603500 kernel: cfg80211: failed to load regulatory.db
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.2709] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.2729] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3373] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3375] audit: op="connection-add" uuid="d530fd1c-8505-4155-a9bf-a2c66df92831" name="br-ex-br" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3390] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3392] audit: op="connection-add" uuid="a98bc08e-0dc8-4ade-ae9b-ea3da6d6c0be" name="br-ex-port" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3402] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3403] audit: op="connection-add" uuid="fa693a01-73fd-4edf-a6ba-b2966defc47f" name="eth1-port" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3415] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3416] audit: op="connection-add" uuid="fefc2eff-2426-4575-95bc-23458dfa5b8c" name="vlan20-port" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3427] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3429] audit: op="connection-add" uuid="8025abf4-5f19-447d-b972-a52e513e1729" name="vlan21-port" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3442] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3443] audit: op="connection-add" uuid="c3966af5-7d7a-4868-be01-390c3820fab6" name="vlan22-port" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3464] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout,802-3-ethernet.mtu" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3484] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3486] audit: op="connection-add" uuid="67fd86b0-4a99-4ef9-b440-ebf66ee17208" name="br-ex-if" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3609] audit: op="connection-update" uuid="8c0b1a26-8ccb-5a0f-94f9-24f522105883" name="ci-private-network" args="connection.controller,connection.port-type,connection.slave-type,connection.master,connection.timestamp,ipv4.routes,ipv4.dns,ipv4.never-default,ipv4.addresses,ipv4.routing-rules,ipv4.method,ipv6.addr-gen-mode,ipv6.dns,ipv6.addresses,ipv6.routing-rules,ipv6.method,ipv6.routes,ovs-external-ids.data,ovs-interface.type" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3629] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3630] audit: op="connection-add" uuid="9f83de59-69d4-4923-9214-20251e303b4c" name="vlan20-if" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3646] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3647] audit: op="connection-add" uuid="77b3d3a4-2b6e-4b89-bfe0-78235627a798" name="vlan21-if" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3662] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3664] audit: op="connection-add" uuid="a0f2e252-f7b7-4a66-baa1-e2163f7c04c6" name="vlan22-if" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3677] audit: op="connection-delete" uuid="0ae45ab3-5e67-3c3b-898d-4a6bebc55a91" name="Wired connection 1" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3698] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3700] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3707] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3710] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (d530fd1c-8505-4155-a9bf-a2c66df92831)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3711] audit: op="connection-activate" uuid="d530fd1c-8505-4155-a9bf-a2c66df92831" name="br-ex-br" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3712] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3713] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3718] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3722] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (a98bc08e-0dc8-4ade-ae9b-ea3da6d6c0be)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3723] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3724] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3729] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3732] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (fa693a01-73fd-4edf-a6ba-b2966defc47f)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3734] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3735] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3741] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3745] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (fefc2eff-2426-4575-95bc-23458dfa5b8c)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3747] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3747] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3752] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3755] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (8025abf4-5f19-447d-b972-a52e513e1729)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3757] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3758] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3762] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3766] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (c3966af5-7d7a-4868-be01-390c3820fab6)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3767] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3769] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3771] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3776] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3777] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3779] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3783] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (67fd86b0-4a99-4ef9-b440-ebf66ee17208)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3784] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3787] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3788] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3790] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3791] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3801] device (eth1): disconnecting for new activation request.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3802] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3804] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3806] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3807] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3810] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3811] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3813] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3817] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (9f83de59-69d4-4923-9214-20251e303b4c)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3817] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3820] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3821] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3822] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3826] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3837] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3840] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3844] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (77b3d3a4-2b6e-4b89-bfe0-78235627a798)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3845] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3847] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3849] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3850] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3853] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <warn>  [1769840036.3854] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3857] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3860] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (a0f2e252-f7b7-4a66-baa1-e2163f7c04c6)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3860] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3863] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3871] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3873] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3874] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3885] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3887] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3890] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3891] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3902] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3906] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3908] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3910] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 kernel: ovs-system: entered promiscuous mode
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3924] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 systemd-udevd[58283]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:13:56 np0005603500 kernel: Timeout policy base is empty
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3931] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3935] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3939] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3941] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3946] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3949] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3951] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3953] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3958] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3963] dhcp4 (eth0): canceled DHCP transaction
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3963] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3964] dhcp4 (eth0): state changed no lease
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3965] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3974] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.3977] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58278 uid=0 result="fail" reason="Device is not activated"
Jan 31 01:13:56 np0005603500 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:13:56 np0005603500 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4151] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4157] dhcp4 (eth0): state changed new lease, address=38.102.83.9
Jan 31 01:13:56 np0005603500 kernel: br-ex: entered promiscuous mode
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4272] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 31 01:13:56 np0005603500 systemd-udevd[58282]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:13:56 np0005603500 kernel: vlan20: entered promiscuous mode
Jan 31 01:13:56 np0005603500 kernel: vlan21: entered promiscuous mode
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4366] device (eth1): Activation: starting connection 'ci-private-network' (8c0b1a26-8ccb-5a0f-94f9-24f522105883)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4371] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4375] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4392] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4401] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4406] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4409] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4414] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4416] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4423] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4427] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4429] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4430] device (eth1): released from controller device eth1
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4434] device (eth1): disconnecting for new activation request.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4435] audit: op="connection-activate" uuid="8c0b1a26-8ccb-5a0f-94f9-24f522105883" name="ci-private-network" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4435] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4438] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4439] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4443] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4446] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4450] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4452] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4456] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4458] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4462] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4465] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4468] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4471] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 kernel: vlan22: entered promiscuous mode
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4477] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 31 01:13:56 np0005603500 systemd-udevd[58284]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4552] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4553] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4559] device (eth1): Activation: starting connection 'ci-private-network' (8c0b1a26-8ccb-5a0f-94f9-24f522105883)
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4562] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58278 uid=0 result="success"
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4573] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4577] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4583] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4610] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4622] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4625] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4632] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4641] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4648] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4652] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4655] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4660] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4671] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4672] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4673] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4677] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4683] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4688] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4693] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4697] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4703] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4708] device (eth1): Activation: successful, device activated.
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4723] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4730] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:13:56 np0005603500 NetworkManager[55506]: <info>  [1769840036.4734] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 01:13:57 np0005603500 NetworkManager[55506]: <info>  [1769840037.6379] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58278 uid=0 result="success"
Jan 31 01:13:57 np0005603500 NetworkManager[55506]: <info>  [1769840037.7733] checkpoint[0x55f1de536950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 31 01:13:57 np0005603500 NetworkManager[55506]: <info>  [1769840037.7736] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58278 uid=0 result="success"
Jan 31 01:13:58 np0005603500 NetworkManager[55506]: <info>  [1769840038.0510] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58278 uid=0 result="success"
Jan 31 01:13:58 np0005603500 NetworkManager[55506]: <info>  [1769840038.0524] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58278 uid=0 result="success"
Jan 31 01:13:58 np0005603500 NetworkManager[55506]: <info>  [1769840038.2504] audit: op="networking-control" arg="global-dns-configuration" pid=58278 uid=0 result="success"
Jan 31 01:13:58 np0005603500 NetworkManager[55506]: <info>  [1769840038.2565] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 31 01:13:58 np0005603500 NetworkManager[55506]: <info>  [1769840038.2821] audit: op="networking-control" arg="global-dns-configuration" pid=58278 uid=0 result="success"
Jan 31 01:13:58 np0005603500 NetworkManager[55506]: <info>  [1769840038.2855] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58278 uid=0 result="success"
Jan 31 01:13:58 np0005603500 NetworkManager[55506]: <info>  [1769840038.4044] checkpoint[0x55f1de536a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 31 01:13:58 np0005603500 NetworkManager[55506]: <info>  [1769840038.4050] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58278 uid=0 result="success"
Jan 31 01:13:58 np0005603500 python3.9[58617]: ansible-ansible.legacy.async_status Invoked with jid=j698970713324.58272 mode=status _async_dir=/root/.ansible_async
Jan 31 01:13:58 np0005603500 ansible-async_wrapper.py[58276]: Module complete (58276)
Jan 31 01:13:59 np0005603500 ansible-async_wrapper.py[58275]: Done in kid B.
Jan 31 01:14:01 np0005603500 python3.9[58722]: ansible-ansible.legacy.async_status Invoked with jid=j698970713324.58272 mode=status _async_dir=/root/.ansible_async
Jan 31 01:14:02 np0005603500 python3.9[58821]: ansible-ansible.legacy.async_status Invoked with jid=j698970713324.58272 mode=cleanup _async_dir=/root/.ansible_async
Jan 31 01:14:03 np0005603500 python3.9[58973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:14:03 np0005603500 python3.9[59096]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840042.6741483-317-171237928460515/.source.returncode _original_basename=.b40upzxc follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:14:04 np0005603500 python3.9[59248]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:14:04 np0005603500 python3.9[59371]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840043.757353-333-124582562603057/.source.cfg _original_basename=.6w803i_h follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:14:05 np0005603500 python3.9[59524]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:14:05 np0005603500 systemd[1]: Reloading Network Manager...
Jan 31 01:14:05 np0005603500 NetworkManager[55506]: <info>  [1769840045.4260] audit: op="reload" arg="0" pid=59528 uid=0 result="success"
Jan 31 01:14:05 np0005603500 NetworkManager[55506]: <info>  [1769840045.4267] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 31 01:14:05 np0005603500 systemd[1]: Reloaded Network Manager.
Jan 31 01:14:05 np0005603500 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 01:14:06 np0005603500 systemd[1]: session-11.scope: Deactivated successfully.
Jan 31 01:14:06 np0005603500 systemd[1]: session-11.scope: Consumed 45.678s CPU time.
Jan 31 01:14:06 np0005603500 systemd-logind[821]: Session 11 logged out. Waiting for processes to exit.
Jan 31 01:14:06 np0005603500 systemd-logind[821]: Removed session 11.
Jan 31 01:14:11 np0005603500 systemd-logind[821]: New session 12 of user zuul.
Jan 31 01:14:11 np0005603500 systemd[1]: Started Session 12 of User zuul.
Jan 31 01:14:11 np0005603500 python3.9[59714]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:14:12 np0005603500 python3.9[59868]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:14:13 np0005603500 python3.9[60057]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:14:14 np0005603500 systemd[1]: session-12.scope: Deactivated successfully.
Jan 31 01:14:14 np0005603500 systemd[1]: session-12.scope: Consumed 2.018s CPU time.
Jan 31 01:14:14 np0005603500 systemd-logind[821]: Session 12 logged out. Waiting for processes to exit.
Jan 31 01:14:14 np0005603500 systemd-logind[821]: Removed session 12.
Jan 31 01:14:15 np0005603500 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:14:20 np0005603500 systemd-logind[821]: New session 13 of user zuul.
Jan 31 01:14:20 np0005603500 systemd[1]: Started Session 13 of User zuul.
Jan 31 01:14:21 np0005603500 python3.9[60242]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:14:22 np0005603500 python3.9[60396]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:14:23 np0005603500 python3.9[60552]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:14:24 np0005603500 python3.9[60636]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:14:26 np0005603500 python3.9[60790]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:14:27 np0005603500 python3.9[60981]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:14:28 np0005603500 python3.9[61134]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:14:28 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:14:29 np0005603500 python3.9[61298]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:14:29 np0005603500 python3.9[61376]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:14:30 np0005603500 python3.9[61528]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:14:30 np0005603500 python3.9[61606]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:14:31 np0005603500 python3.9[61758]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:14:31 np0005603500 python3.9[61910]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:14:32 np0005603500 python3.9[62062]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:14:32 np0005603500 python3.9[62214]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:14:33 np0005603500 python3.9[62366]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:14:35 np0005603500 python3.9[62519]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:14:36 np0005603500 python3.9[62673]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:14:37 np0005603500 python3.9[62825]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:14:37 np0005603500 python3.9[62977]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:14:38 np0005603500 python3.9[63130]: ansible-service_facts Invoked
Jan 31 01:14:38 np0005603500 network[63147]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 01:14:38 np0005603500 network[63148]: 'network-scripts' will be removed from distribution in near future.
Jan 31 01:14:38 np0005603500 network[63149]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 01:14:43 np0005603500 python3.9[63601]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:14:45 np0005603500 python3.9[63754]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 31 01:14:46 np0005603500 python3.9[63906]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:14:47 np0005603500 python3.9[64031]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840086.4021597-227-166323513151250/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:14:48 np0005603500 python3.9[64185]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:14:48 np0005603500 python3.9[64310]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840087.9042115-242-75871000899429/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:14:49 np0005603500 python3.9[64464]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:14:50 np0005603500 python3.9[64618]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:14:52 np0005603500 python3.9[64702]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:14:53 np0005603500 python3.9[64856]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:14:54 np0005603500 python3.9[64940]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:14:54 np0005603500 chronyd[830]: chronyd exiting
Jan 31 01:14:54 np0005603500 systemd[1]: Stopping NTP client/server...
Jan 31 01:14:54 np0005603500 systemd[1]: chronyd.service: Deactivated successfully.
Jan 31 01:14:54 np0005603500 systemd[1]: Stopped NTP client/server.
Jan 31 01:14:54 np0005603500 systemd[1]: Starting NTP client/server...
Jan 31 01:14:54 np0005603500 chronyd[64949]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 31 01:14:54 np0005603500 chronyd[64949]: Frequency -31.484 +/- 0.302 ppm read from /var/lib/chrony/drift
Jan 31 01:14:54 np0005603500 chronyd[64949]: Loaded seccomp filter (level 2)
Jan 31 01:14:54 np0005603500 systemd[1]: Started NTP client/server.
Jan 31 01:14:54 np0005603500 systemd[1]: session-13.scope: Deactivated successfully.
Jan 31 01:14:54 np0005603500 systemd[1]: session-13.scope: Consumed 21.747s CPU time.
Jan 31 01:14:54 np0005603500 systemd-logind[821]: Session 13 logged out. Waiting for processes to exit.
Jan 31 01:14:54 np0005603500 systemd-logind[821]: Removed session 13.
Jan 31 01:15:01 np0005603500 systemd-logind[821]: New session 14 of user zuul.
Jan 31 01:15:01 np0005603500 systemd[1]: Started Session 14 of User zuul.
Jan 31 01:15:02 np0005603500 python3.9[65128]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:15:03 np0005603500 python3.9[65284]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:03 np0005603500 python3.9[65459]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:04 np0005603500 python3.9[65537]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.w3w6i2sh recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:05 np0005603500 python3.9[65689]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:05 np0005603500 python3.9[65812]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840104.8146262-56-195223698456231/.source _original_basename=.9rknr087 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:06 np0005603500 python3.9[65964]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:15:07 np0005603500 python3.9[66116]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:07 np0005603500 python3.9[66239]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840106.6271071-80-203293259003985/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:15:08 np0005603500 python3.9[66391]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:08 np0005603500 python3.9[66514]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840107.6517746-80-265897185757495/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:15:09 np0005603500 python3.9[66666]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:09 np0005603500 python3.9[66818]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:10 np0005603500 python3.9[66941]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840109.4769032-117-255159754572475/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:10 np0005603500 python3.9[67093]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:11 np0005603500 python3.9[67216]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840110.5296488-132-84317471034906/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:12 np0005603500 python3.9[67368]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:15:12 np0005603500 systemd[1]: Reloading.
Jan 31 01:15:12 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:15:12 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:15:12 np0005603500 systemd[1]: Reloading.
Jan 31 01:15:13 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:15:13 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:15:13 np0005603500 systemd[1]: Starting EDPM Container Shutdown...
Jan 31 01:15:13 np0005603500 systemd[1]: Finished EDPM Container Shutdown.
Jan 31 01:15:13 np0005603500 python3.9[67595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:14 np0005603500 python3.9[67718]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840113.3635633-155-21057106720410/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:14 np0005603500 python3.9[67870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:15 np0005603500 python3.9[67993]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840114.4568717-170-118749342574491/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:16 np0005603500 python3.9[68145]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:15:16 np0005603500 systemd[1]: Reloading.
Jan 31 01:15:16 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:15:16 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:15:16 np0005603500 systemd[1]: Reloading.
Jan 31 01:15:16 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:15:16 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:15:16 np0005603500 systemd[1]: Starting Create netns directory...
Jan 31 01:15:16 np0005603500 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 01:15:16 np0005603500 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 01:15:16 np0005603500 systemd[1]: Finished Create netns directory.
Jan 31 01:15:17 np0005603500 python3.9[68371]: ansible-ansible.builtin.service_facts Invoked
Jan 31 01:15:17 np0005603500 network[68388]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 01:15:17 np0005603500 network[68389]: 'network-scripts' will be removed from distribution in near future.
Jan 31 01:15:17 np0005603500 network[68390]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 01:15:20 np0005603500 python3.9[68652]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:15:20 np0005603500 systemd[1]: Reloading.
Jan 31 01:15:20 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:15:20 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:15:20 np0005603500 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 31 01:15:20 np0005603500 iptables.init[68691]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 31 01:15:20 np0005603500 iptables.init[68691]: iptables: Flushing firewall rules: [  OK  ]
Jan 31 01:15:20 np0005603500 systemd[1]: iptables.service: Deactivated successfully.
Jan 31 01:15:20 np0005603500 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 31 01:15:21 np0005603500 python3.9[68887]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:15:22 np0005603500 python3.9[69041]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:15:22 np0005603500 systemd[1]: Reloading.
Jan 31 01:15:22 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:15:22 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:15:22 np0005603500 systemd[1]: Starting Netfilter Tables...
Jan 31 01:15:22 np0005603500 systemd[1]: Finished Netfilter Tables.
Jan 31 01:15:23 np0005603500 python3.9[69234]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:15:24 np0005603500 python3.9[69387]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:24 np0005603500 python3.9[69512]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840123.5190568-239-96660352787057/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:25 np0005603500 python3.9[69665]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:15:25 np0005603500 systemd[1]: Reloading OpenSSH server daemon...
Jan 31 01:15:25 np0005603500 systemd[1]: Reloaded OpenSSH server daemon.
Jan 31 01:15:25 np0005603500 python3.9[69821]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:26 np0005603500 python3.9[69973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:26 np0005603500 python3.9[70096]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840126.0684707-270-229442418412238/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:27 np0005603500 python3.9[70248]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 31 01:15:27 np0005603500 systemd[1]: Starting Time & Date Service...
Jan 31 01:15:27 np0005603500 systemd[1]: Started Time & Date Service.
Jan 31 01:15:28 np0005603500 python3.9[70404]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:29 np0005603500 python3.9[70556]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:29 np0005603500 python3.9[70679]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840128.6937761-305-234278531165981/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:30 np0005603500 python3.9[70831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:30 np0005603500 python3.9[70954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840129.7701654-320-276252516427987/.source.yaml _original_basename=.pwgvedi4 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:31 np0005603500 python3.9[71106]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:31 np0005603500 python3.9[71229]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840130.8873372-335-50673393798323/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:32 np0005603500 python3.9[71381]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:15:32 np0005603500 python3.9[71534]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:15:33 np0005603500 python3[71687]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 01:15:34 np0005603500 python3.9[71839]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:34 np0005603500 python3.9[71962]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840133.8795114-374-169290707397151/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:35 np0005603500 python3.9[72114]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:35 np0005603500 python3.9[72237]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840134.9278944-389-257819512214696/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:36 np0005603500 python3.9[72389]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:37 np0005603500 python3.9[72512]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840136.1849742-404-177172627961541/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:37 np0005603500 python3.9[72664]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:38 np0005603500 python3.9[72787]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840137.2665608-419-124864547895466/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:39 np0005603500 python3.9[72939]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:15:39 np0005603500 python3.9[73062]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840138.5070927-434-69244434559552/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:40 np0005603500 python3.9[73214]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:40 np0005603500 python3.9[73366]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:15:41 np0005603500 python3.9[73525]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:42 np0005603500 python3.9[73678]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:42 np0005603500 python3.9[73830]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:43 np0005603500 python3.9[73982]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 31 01:15:43 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:15:44 np0005603500 python3.9[74136]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 31 01:15:44 np0005603500 systemd[1]: session-14.scope: Deactivated successfully.
Jan 31 01:15:44 np0005603500 systemd[1]: session-14.scope: Consumed 29.836s CPU time.
Jan 31 01:15:44 np0005603500 systemd-logind[821]: Session 14 logged out. Waiting for processes to exit.
Jan 31 01:15:44 np0005603500 systemd-logind[821]: Removed session 14.
Jan 31 01:15:50 np0005603500 systemd-logind[821]: New session 15 of user zuul.
Jan 31 01:15:50 np0005603500 systemd[1]: Started Session 15 of User zuul.
Jan 31 01:15:51 np0005603500 python3.9[74317]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 31 01:15:52 np0005603500 python3.9[74469]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:15:53 np0005603500 python3.9[74621]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:15:54 np0005603500 python3.9[74773]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCgeV+xQFI1bLR2csi65phXibO0pXJlXpO/o3Lv2Qwsu+xuCJ+W90jVcv2ifmPrVf6DSgwBCOGqEFOLKS0rflCHJKYV4Y5Q2ID0/2Kq8UIlxW4lzSP2Wkd1BRbkwBJhXzZXDCHSzKpf9NKNEFs83YyOHXUfWd9iBC8yIzXb0SiZ2lJs3yzmfL8KFLH1HTU2Nw0ZwZNT9nHphaSixFZU6//MAFKMO2irnCmCxPYNoOU/iX3pLlYVjeWelQeWQHmOH53rWGvJqbHkeXmEVSamS4EA3z0a4yYMFynKougAOyFFgMjV1Z+R76EPPcMlyEHuZPfoXWP22sOwVDoUc2fRGmbFY9wSWr9+8ZMkQNhWtL9wKmBmsT5fo0ai2CCCxlwa3uys6bqqhgJwpB6aAXVWrfiwe7pBKqwUMPWQY3qACsBTHVA/TqQGf5WUIlRB6HCSlbS8pfL2uJnqbJWb+NfgnISObbVucy9XubeJA8ToyrXXYKIrWoY5r4BQemJ7dECofSM=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICS+/8xeaeM22OEr85SMM2zEgzTZMQtGhfp12nh4sbQ1#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJQu6Q+8m6rmZOp1cFykfJrVmvyhOpcoKYZ09bcYL0tOyxgOohdV/pzpe/HAqgssljzrQQJMrV5bWP2nHUxl1Sg=#012 create=True mode=0644 path=/tmp/ansible.okl31bzl state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:54 np0005603500 python3.9[74925]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.okl31bzl' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:15:56 np0005603500 python3.9[75079]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.okl31bzl state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:15:56 np0005603500 systemd-logind[821]: Session 15 logged out. Waiting for processes to exit.
Jan 31 01:15:56 np0005603500 systemd[1]: session-15.scope: Deactivated successfully.
Jan 31 01:15:56 np0005603500 systemd[1]: session-15.scope: Consumed 3.430s CPU time.
Jan 31 01:15:56 np0005603500 systemd-logind[821]: Removed session 15.
Jan 31 01:15:57 np0005603500 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 01:16:01 np0005603500 systemd-logind[821]: New session 16 of user zuul.
Jan 31 01:16:01 np0005603500 systemd[1]: Started Session 16 of User zuul.
Jan 31 01:16:02 np0005603500 python3.9[75259]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:16:04 np0005603500 python3.9[75415]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 31 01:16:04 np0005603500 python3.9[75569]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:16:05 np0005603500 python3.9[75722]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:16:06 np0005603500 python3.9[75875]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:16:07 np0005603500 python3.9[76029]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:16:07 np0005603500 python3.9[76184]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:08 np0005603500 systemd[1]: session-16.scope: Deactivated successfully.
Jan 31 01:16:08 np0005603500 systemd[1]: session-16.scope: Consumed 3.882s CPU time.
Jan 31 01:16:08 np0005603500 systemd-logind[821]: Session 16 logged out. Waiting for processes to exit.
Jan 31 01:16:08 np0005603500 systemd-logind[821]: Removed session 16.
Jan 31 01:16:13 np0005603500 systemd-logind[821]: New session 17 of user zuul.
Jan 31 01:16:13 np0005603500 systemd[1]: Started Session 17 of User zuul.
Jan 31 01:16:14 np0005603500 python3.9[76362]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:16:15 np0005603500 python3.9[76518]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:16:16 np0005603500 python3.9[76602]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 01:16:18 np0005603500 python3.9[76753]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:16:19 np0005603500 python3.9[76905]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 01:16:19 np0005603500 python3.9[77055]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:16:20 np0005603500 python3.9[77205]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:16:20 np0005603500 systemd[1]: session-17.scope: Deactivated successfully.
Jan 31 01:16:20 np0005603500 systemd[1]: session-17.scope: Consumed 5.266s CPU time.
Jan 31 01:16:20 np0005603500 systemd-logind[821]: Session 17 logged out. Waiting for processes to exit.
Jan 31 01:16:20 np0005603500 systemd-logind[821]: Removed session 17.
Jan 31 01:16:26 np0005603500 systemd-logind[821]: New session 18 of user zuul.
Jan 31 01:16:26 np0005603500 systemd[1]: Started Session 18 of User zuul.
Jan 31 01:16:27 np0005603500 python3.9[77383]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:16:29 np0005603500 python3.9[77539]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:29 np0005603500 python3.9[77691]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:30 np0005603500 python3.9[77843]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:31 np0005603500 python3.9[77966]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840189.8642683-60-112007091970317/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=74e2c4dfd565814970da17100f51904b26b958a7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:31 np0005603500 python3.9[78118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:32 np0005603500 python3.9[78241]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840191.1744292-60-108822099432229/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=71eaa8989cee63ee59d80ec7b453b6273e09969b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:32 np0005603500 python3.9[78393]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:33 np0005603500 python3.9[78516]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840192.2268224-60-189442386770299/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7e2b3f365e1c513a5b79ead5790305a5ac7417ca backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:33 np0005603500 python3.9[78668]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:34 np0005603500 python3.9[78820]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:34 np0005603500 python3.9[78972]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:35 np0005603500 python3.9[79095]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840194.48963-119-154819765938068/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=c79ca7b36574f70aa13c7986658c847ed1d6c7ed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:35 np0005603500 python3.9[79247]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:36 np0005603500 python3.9[79370]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840195.544312-119-121314992745152/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=222d68470e018d504c578fc766ef01f640fb7395 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:37 np0005603500 python3.9[79522]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:37 np0005603500 python3.9[79645]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840196.6873913-119-6263489128119/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=de098607a7c2152453bc06f3cb9ed09159142992 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:38 np0005603500 python3.9[79797]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:38 np0005603500 python3.9[79949]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:39 np0005603500 python3.9[80101]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:39 np0005603500 python3.9[80224]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840199.0483701-178-38060726408645/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=22a66c691368a13ad49771bbf13f01b3d2666818 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:40 np0005603500 python3.9[80376]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:40 np0005603500 python3.9[80499]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840200.0908256-178-67754617348260/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f55e3304644b2f29c59355448ba7c2c84f865adf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:41 np0005603500 python3.9[80651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:41 np0005603500 python3.9[80774]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840201.0944412-178-186313076184822/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=e57acdc5e17401636da763c69147e8052712b902 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:42 np0005603500 python3.9[80926]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:43 np0005603500 python3.9[81078]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:43 np0005603500 python3.9[81230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:44 np0005603500 python3.9[81353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840203.3377426-237-160290034808564/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=c2d9711d7309c44ab9d26d9c6514d7a729df7257 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:44 np0005603500 python3.9[81505]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:45 np0005603500 python3.9[81628]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840204.402043-237-228296141320081/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f55e3304644b2f29c59355448ba7c2c84f865adf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:45 np0005603500 python3.9[81780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:46 np0005603500 python3.9[81903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840205.5423558-237-4266919811647/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3520a666637af42741b8dc4a18f5c8739237a58d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:47 np0005603500 python3.9[82055]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:48 np0005603500 python3.9[82207]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:48 np0005603500 python3.9[82330]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840207.847964-305-139044887746450/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21080916ed436367f9d55ee0e8f00e696634482f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:49 np0005603500 python3.9[82482]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:50 np0005603500 python3.9[82634]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:50 np0005603500 python3.9[82758]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840209.7644196-329-191086528707525/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21080916ed436367f9d55ee0e8f00e696634482f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:51 np0005603500 python3.9[82911]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:52 np0005603500 python3.9[83063]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:52 np0005603500 python3.9[83186]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840211.5819619-353-223767335719867/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21080916ed436367f9d55ee0e8f00e696634482f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:53 np0005603500 python3.9[83338]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:53 np0005603500 python3.9[83490]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:54 np0005603500 python3.9[83613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840213.476442-377-46739951003531/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21080916ed436367f9d55ee0e8f00e696634482f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:55 np0005603500 python3.9[83765]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:55 np0005603500 python3.9[83917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:56 np0005603500 python3.9[84040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840215.322157-401-3650256506206/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21080916ed436367f9d55ee0e8f00e696634482f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:56 np0005603500 python3.9[84192]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:58 np0005603500 python3.9[84344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:16:58 np0005603500 python3.9[84467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840217.1302574-425-211970462379157/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21080916ed436367f9d55ee0e8f00e696634482f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:16:59 np0005603500 python3.9[84619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:16:59 np0005603500 python3.9[84771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:00 np0005603500 python3.9[84894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840219.4010327-449-216383969820424/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21080916ed436367f9d55ee0e8f00e696634482f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:00 np0005603500 systemd[1]: session-18.scope: Deactivated successfully.
Jan 31 01:17:00 np0005603500 systemd[1]: session-18.scope: Consumed 25.572s CPU time.
Jan 31 01:17:00 np0005603500 systemd-logind[821]: Session 18 logged out. Waiting for processes to exit.
Jan 31 01:17:00 np0005603500 systemd-logind[821]: Removed session 18.
Jan 31 01:17:03 np0005603500 chronyd[64949]: Selected source 167.160.187.12 (pool.ntp.org)
Jan 31 01:17:07 np0005603500 systemd-logind[821]: New session 19 of user zuul.
Jan 31 01:17:07 np0005603500 systemd[1]: Started Session 19 of User zuul.
Jan 31 01:17:08 np0005603500 python3.9[85072]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:17:09 np0005603500 python3.9[85228]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:17:09 np0005603500 python3.9[85380]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:17:10 np0005603500 python3.9[85530]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:17:11 np0005603500 python3.9[85682]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 31 01:17:13 np0005603500 dbus-broker-launch[801]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 31 01:17:14 np0005603500 python3.9[85838]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:17:15 np0005603500 python3.9[85922]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:17:17 np0005603500 python3.9[86075]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 01:17:18 np0005603500 python3[86230]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 31 01:17:18 np0005603500 python3.9[86382]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:19 np0005603500 python3.9[86534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:20 np0005603500 python3.9[86612]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:20 np0005603500 python3.9[86764]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:21 np0005603500 python3.9[86842]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qx_15ok7 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:21 np0005603500 python3.9[86994]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:22 np0005603500 python3.9[87072]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:23 np0005603500 python3.9[87224]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:17:23 np0005603500 python3[87377]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 01:17:24 np0005603500 python3.9[87529]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:25 np0005603500 python3.9[87654]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840244.053041-152-131724946103827/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:25 np0005603500 python3.9[87806]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:26 np0005603500 python3.9[87931]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840245.4492598-167-180803349249523/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:26 np0005603500 python3.9[88083]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:27 np0005603500 python3.9[88208]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840246.4597704-182-239944167962625/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:28 np0005603500 python3.9[88360]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:28 np0005603500 python3.9[88485]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840247.6208968-197-37835713739010/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:29 np0005603500 python3.9[88637]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:29 np0005603500 python3.9[88762]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840248.7278938-212-198580459207604/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:30 np0005603500 python3.9[88914]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:31 np0005603500 python3.9[89066]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:17:31 np0005603500 python3.9[89221]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:32 np0005603500 python3.9[89373]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:17:33 np0005603500 python3.9[89526]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:17:33 np0005603500 python3.9[89680]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:17:34 np0005603500 python3.9[89835]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:35 np0005603500 python3.9[89985]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:17:36 np0005603500 python3.9[90138]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:9e:41:65:cf" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:17:36 np0005603500 ovs-vsctl[90139]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:9e:41:65:cf external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 31 01:17:37 np0005603500 python3.9[90291]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:17:37 np0005603500 python3.9[90446]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:17:37 np0005603500 ovs-vsctl[90447]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 31 01:17:38 np0005603500 python3.9[90597]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:17:38 np0005603500 python3.9[90751]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:17:39 np0005603500 python3.9[90903]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:39 np0005603500 python3.9[90981]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:17:40 np0005603500 python3.9[91133]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:40 np0005603500 python3.9[91211]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:17:41 np0005603500 python3.9[91363]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:42 np0005603500 python3.9[91515]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:42 np0005603500 python3.9[91593]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:43 np0005603500 python3.9[91745]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:44 np0005603500 python3.9[91823]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:45 np0005603500 python3.9[91975]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:17:45 np0005603500 systemd[1]: Reloading.
Jan 31 01:17:45 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:17:45 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:17:46 np0005603500 python3.9[92164]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:46 np0005603500 python3.9[92242]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:47 np0005603500 python3.9[92394]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:47 np0005603500 python3.9[92472]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:48 np0005603500 python3.9[92624]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:17:48 np0005603500 systemd[1]: Reloading.
Jan 31 01:17:48 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:17:48 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:17:48 np0005603500 systemd[1]: Starting Create netns directory...
Jan 31 01:17:48 np0005603500 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 01:17:48 np0005603500 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 01:17:48 np0005603500 systemd[1]: Finished Create netns directory.
Jan 31 01:17:49 np0005603500 python3.9[92817]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:17:49 np0005603500 python3.9[92969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:50 np0005603500 python3.9[93092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840269.1943662-463-81579713561603/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:17:50 np0005603500 python3.9[93244]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:51 np0005603500 python3.9[93396]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:17:52 np0005603500 python3.9[93548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:17:52 np0005603500 python3.9[93671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840271.6242828-496-98793684467595/.source.json _original_basename=.iidsbv1d follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:53 np0005603500 python3.9[93821]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:55 np0005603500 python3.9[94244]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 31 01:17:55 np0005603500 python3.9[94396]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 01:17:57 np0005603500 python3[94548]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 01:17:57 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:17:57 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:17:57 np0005603500 podman[94585]: 2026-01-31 06:17:57.26577749 +0000 UTC m=+0.021600063 image pull 41feda93bd9d79a05e844f418942ad169ecdce2acd7eff3ec0131c566040529d quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:17:57 np0005603500 podman[94585]: 2026-01-31 06:17:57.935740193 +0000 UTC m=+0.691562716 container create 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:17:57 np0005603500 python3[94548]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:17:58 np0005603500 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 01:17:58 np0005603500 python3.9[94775]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:17:59 np0005603500 python3.9[94929]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:17:59 np0005603500 python3.9[95005]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:18:00 np0005603500 python3.9[95156]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769840279.9374726-574-121135656875285/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:01 np0005603500 python3.9[95232]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:18:01 np0005603500 systemd[1]: Reloading.
Jan 31 01:18:01 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:18:01 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:18:01 np0005603500 python3.9[95343]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:18:01 np0005603500 systemd[1]: Reloading.
Jan 31 01:18:01 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:18:01 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:18:02 np0005603500 systemd[1]: Starting ovn_controller container...
Jan 31 01:18:02 np0005603500 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 31 01:18:02 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:18:02 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fc938f7793ba0c9db756f2fd82780ef1d397b148cca6dc625526b0acd0d4ea6/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 31 01:18:02 np0005603500 systemd[1]: Started /usr/bin/podman healthcheck run 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d.
Jan 31 01:18:02 np0005603500 podman[95383]: 2026-01-31 06:18:02.216705322 +0000 UTC m=+0.148383449 container init 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: + sudo -E kolla_set_configs
Jan 31 01:18:02 np0005603500 podman[95383]: 2026-01-31 06:18:02.241025309 +0000 UTC m=+0.172703426 container start 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 31 01:18:02 np0005603500 systemd[1]: Created slice User Slice of UID 0.
Jan 31 01:18:02 np0005603500 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 31 01:18:02 np0005603500 edpm-start-podman-container[95383]: ovn_controller
Jan 31 01:18:02 np0005603500 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 31 01:18:02 np0005603500 systemd[1]: Starting User Manager for UID 0...
Jan 31 01:18:02 np0005603500 edpm-start-podman-container[95382]: Creating additional drop-in dependency for "ovn_controller" (072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d)
Jan 31 01:18:02 np0005603500 podman[95405]: 2026-01-31 06:18:02.301163541 +0000 UTC m=+0.053265429 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 01:18:02 np0005603500 systemd[1]: 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d-7df356f049990275.service: Main process exited, code=exited, status=1/FAILURE
Jan 31 01:18:02 np0005603500 systemd[1]: 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d-7df356f049990275.service: Failed with result 'exit-code'.
Jan 31 01:18:02 np0005603500 systemd[1]: Reloading.
Jan 31 01:18:02 np0005603500 systemd[95428]: Queued start job for default target Main User Target.
Jan 31 01:18:02 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:18:02 np0005603500 systemd[95428]: Created slice User Application Slice.
Jan 31 01:18:02 np0005603500 systemd[95428]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 31 01:18:02 np0005603500 systemd[95428]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 01:18:02 np0005603500 systemd[95428]: Reached target Paths.
Jan 31 01:18:02 np0005603500 systemd[95428]: Reached target Timers.
Jan 31 01:18:02 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:18:02 np0005603500 systemd[95428]: Starting D-Bus User Message Bus Socket...
Jan 31 01:18:02 np0005603500 systemd[95428]: Starting Create User's Volatile Files and Directories...
Jan 31 01:18:02 np0005603500 systemd[95428]: Listening on D-Bus User Message Bus Socket.
Jan 31 01:18:02 np0005603500 systemd[95428]: Reached target Sockets.
Jan 31 01:18:02 np0005603500 systemd[95428]: Finished Create User's Volatile Files and Directories.
Jan 31 01:18:02 np0005603500 systemd[95428]: Reached target Basic System.
Jan 31 01:18:02 np0005603500 systemd[95428]: Reached target Main User Target.
Jan 31 01:18:02 np0005603500 systemd[95428]: Startup finished in 91ms.
Jan 31 01:18:02 np0005603500 systemd[1]: Started User Manager for UID 0.
Jan 31 01:18:02 np0005603500 systemd[1]: Started ovn_controller container.
Jan 31 01:18:02 np0005603500 systemd[1]: Started Session c1 of User root.
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: INFO:__main__:Validating config file
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: INFO:__main__:Writing out command to execute
Jan 31 01:18:02 np0005603500 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: ++ cat /run_command
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: + ARGS=
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: + sudo kolla_copy_cacerts
Jan 31 01:18:02 np0005603500 systemd[1]: Started Session c2 of User root.
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: + [[ ! -n '' ]]
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: + . kolla_extend_start
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: + umask 0022
Jan 31 01:18:02 np0005603500 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Jan 31 01:18:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:02Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Jan 31 01:18:02 np0005603500 NetworkManager[55506]: <info>  [1769840282.6529] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 31 01:18:02 np0005603500 NetworkManager[55506]: <info>  [1769840282.6538] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:18:02 np0005603500 NetworkManager[55506]: <warn>  [1769840282.6540] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 01:18:02 np0005603500 NetworkManager[55506]: <info>  [1769840282.6548] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 31 01:18:02 np0005603500 NetworkManager[55506]: <info>  [1769840282.6555] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 31 01:18:02 np0005603500 NetworkManager[55506]: <info>  [1769840282.6562] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 31 01:18:02 np0005603500 kernel: br-int: entered promiscuous mode
Jan 31 01:18:02 np0005603500 systemd-udevd[95529]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:18:03 np0005603500 python3.9[95657]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00025|main|INFO|OVS feature set changed, force recompute.
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00034|features|INFO|OVS Feature: group_support, state: supported
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00035|main|INFO|OVS feature set changed, force recompute.
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 31 01:18:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:03Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 31 01:18:04 np0005603500 python3.9[95810]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:04 np0005603500 python3.9[95933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840283.7550862-619-254422338177575/.source.yaml _original_basename=.jrdl0z0m follow=False checksum=67ca52efc285baa1f04af0a9d05447032c469125 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:05 np0005603500 python3.9[96085]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:18:05 np0005603500 ovs-vsctl[96086]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 31 01:18:05 np0005603500 NetworkManager[55506]: <info>  [1769840285.4344] manager: (ovn-6220b5-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 31 01:18:05 np0005603500 systemd-udevd[95531]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:18:05 np0005603500 kernel: genev_sys_6081: entered promiscuous mode
Jan 31 01:18:05 np0005603500 NetworkManager[55506]: <info>  [1769840285.4834] device (genev_sys_6081): carrier: link connected
Jan 31 01:18:05 np0005603500 NetworkManager[55506]: <info>  [1769840285.4838] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Jan 31 01:18:05 np0005603500 python3.9[96241]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:18:05 np0005603500 ovs-vsctl[96243]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 31 01:18:06 np0005603500 python3.9[96396]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:18:06 np0005603500 ovs-vsctl[96397]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 31 01:18:07 np0005603500 systemd[1]: session-19.scope: Deactivated successfully.
Jan 31 01:18:07 np0005603500 systemd[1]: session-19.scope: Consumed 40.076s CPU time.
Jan 31 01:18:07 np0005603500 systemd-logind[821]: Session 19 logged out. Waiting for processes to exit.
Jan 31 01:18:07 np0005603500 systemd-logind[821]: Removed session 19.
Jan 31 01:18:12 np0005603500 systemd-logind[821]: New session 21 of user zuul.
Jan 31 01:18:12 np0005603500 systemd[1]: Started Session 21 of User zuul.
Jan 31 01:18:12 np0005603500 systemd[1]: Stopping User Manager for UID 0...
Jan 31 01:18:12 np0005603500 systemd[95428]: Activating special unit Exit the Session...
Jan 31 01:18:12 np0005603500 systemd[95428]: Stopped target Main User Target.
Jan 31 01:18:12 np0005603500 systemd[95428]: Stopped target Basic System.
Jan 31 01:18:12 np0005603500 systemd[95428]: Stopped target Paths.
Jan 31 01:18:12 np0005603500 systemd[95428]: Stopped target Sockets.
Jan 31 01:18:12 np0005603500 systemd[95428]: Stopped target Timers.
Jan 31 01:18:12 np0005603500 systemd[95428]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 01:18:12 np0005603500 systemd[95428]: Closed D-Bus User Message Bus Socket.
Jan 31 01:18:12 np0005603500 systemd[95428]: Stopped Create User's Volatile Files and Directories.
Jan 31 01:18:12 np0005603500 systemd[95428]: Removed slice User Application Slice.
Jan 31 01:18:12 np0005603500 systemd[95428]: Reached target Shutdown.
Jan 31 01:18:12 np0005603500 systemd[95428]: Finished Exit the Session.
Jan 31 01:18:12 np0005603500 systemd[95428]: Reached target Exit the Session.
Jan 31 01:18:12 np0005603500 systemd[1]: user@0.service: Deactivated successfully.
Jan 31 01:18:12 np0005603500 systemd[1]: Stopped User Manager for UID 0.
Jan 31 01:18:12 np0005603500 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 31 01:18:12 np0005603500 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 31 01:18:12 np0005603500 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 31 01:18:12 np0005603500 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 31 01:18:12 np0005603500 systemd[1]: Removed slice User Slice of UID 0.
Jan 31 01:18:13 np0005603500 python3.9[96577]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:18:14 np0005603500 python3.9[96733]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:15 np0005603500 python3.9[96885]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:15 np0005603500 python3.9[97037]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:16 np0005603500 python3.9[97189]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:16 np0005603500 python3.9[97341]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:17 np0005603500 python3.9[97491]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:18:18 np0005603500 python3.9[97643]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 31 01:18:19 np0005603500 python3.9[97793]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:20 np0005603500 python3.9[97914]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840298.9386997-81-250574848591274/.source follow=False _original_basename=haproxy.j2 checksum=5e5da90e54b3adab1c0c1719eacfb6cd4ba2dd4c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:20 np0005603500 python3.9[98064]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:21 np0005603500 python3.9[98185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840300.2674682-96-205270058651673/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:21 np0005603500 python3.9[98337]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:18:22 np0005603500 python3.9[98422]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:18:25 np0005603500 python3.9[98575]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 01:18:25 np0005603500 python3.9[98728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:26 np0005603500 python3.9[98849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840305.3378437-133-207714555273336/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:26 np0005603500 python3.9[98999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:27 np0005603500 python3.9[99120]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840306.352799-133-107741659746957/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:28 np0005603500 python3.9[99270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:28 np0005603500 python3.9[99391]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840307.9848003-177-74803752030582/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:29 np0005603500 python3.9[99541]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:29 np0005603500 python3.9[99662]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840309.11059-177-107222485739178/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:30 np0005603500 python3.9[99812]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:18:31 np0005603500 python3.9[99966]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:31 np0005603500 python3.9[100118]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:32 np0005603500 python3.9[100196]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:32 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:32Z|00038|memory|INFO|16384 kB peak resident set size after 29.8 seconds
Jan 31 01:18:32 np0005603500 ovn_controller[95398]: 2026-01-31T06:18:32Z|00039|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Jan 31 01:18:32 np0005603500 podman[100320]: 2026-01-31 06:18:32.501683692 +0000 UTC m=+0.080198945 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:18:32 np0005603500 python3.9[100361]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:33 np0005603500 python3.9[100453]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:33 np0005603500 python3.9[100605]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:34 np0005603500 python3.9[100757]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:34 np0005603500 python3.9[100835]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:35 np0005603500 python3.9[100987]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:35 np0005603500 python3.9[101065]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:36 np0005603500 python3.9[101218]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:18:36 np0005603500 systemd[1]: Reloading.
Jan 31 01:18:36 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:18:36 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:18:37 np0005603500 python3.9[101407]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:38 np0005603500 python3.9[101485]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:38 np0005603500 python3.9[101637]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:39 np0005603500 python3.9[101715]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:39 np0005603500 python3.9[101867]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:18:39 np0005603500 systemd[1]: Reloading.
Jan 31 01:18:39 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:18:39 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:18:39 np0005603500 systemd[1]: Starting Create netns directory...
Jan 31 01:18:39 np0005603500 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 01:18:39 np0005603500 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 01:18:39 np0005603500 systemd[1]: Finished Create netns directory.
Jan 31 01:18:40 np0005603500 python3.9[102061]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:41 np0005603500 python3.9[102213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:41 np0005603500 python3.9[102336]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840320.7904115-328-18660701140481/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:42 np0005603500 python3.9[102488]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:43 np0005603500 python3.9[102640]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:18:43 np0005603500 python3.9[102792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:44 np0005603500 python3.9[102915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840323.2821379-361-85359346811071/.source.json _original_basename=.l3odpcdk follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:44 np0005603500 python3.9[103065]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:46 np0005603500 python3.9[103488]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 31 01:18:47 np0005603500 python3.9[103640]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 01:18:48 np0005603500 python3[103792]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 01:18:48 np0005603500 podman[103826]: 2026-01-31 06:18:48.841368853 +0000 UTC m=+0.043026888 container create b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2)
Jan 31 01:18:48 np0005603500 podman[103826]: 2026-01-31 06:18:48.816985395 +0000 UTC m=+0.018643440 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:18:48 np0005603500 python3[103792]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:18:49 np0005603500 python3.9[104016]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:18:50 np0005603500 python3.9[104170]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:50 np0005603500 python3.9[104246]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:18:51 np0005603500 python3.9[104397]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769840330.6165767-439-206130847021865/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:51 np0005603500 python3.9[104473]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:18:51 np0005603500 systemd[1]: Reloading.
Jan 31 01:18:51 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:18:51 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:18:52 np0005603500 python3.9[104584]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:18:52 np0005603500 systemd[1]: Reloading.
Jan 31 01:18:52 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:18:52 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:18:52 np0005603500 systemd[1]: Starting ovn_metadata_agent container...
Jan 31 01:18:52 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:18:53 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86d3c87e2d781e9a3e253cfd741ea65a11f37547c4a4a0e41fccd5a3f3dd82c6/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 31 01:18:53 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86d3c87e2d781e9a3e253cfd741ea65a11f37547c4a4a0e41fccd5a3f3dd82c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:18:53 np0005603500 systemd[1]: Started /usr/bin/podman healthcheck run b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227.
Jan 31 01:18:53 np0005603500 podman[104624]: 2026-01-31 06:18:53.045492032 +0000 UTC m=+0.127492787 container init b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible)
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: + sudo -E kolla_set_configs
Jan 31 01:18:53 np0005603500 podman[104624]: 2026-01-31 06:18:53.076511496 +0000 UTC m=+0.158512241 container start b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:18:53 np0005603500 edpm-start-podman-container[104624]: ovn_metadata_agent
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Validating config file
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Copying service configuration files
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Writing out command to execute
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 31 01:18:53 np0005603500 edpm-start-podman-container[104623]: Creating additional drop-in dependency for "ovn_metadata_agent" (b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227)
Jan 31 01:18:53 np0005603500 podman[104645]: 2026-01-31 06:18:53.123074454 +0000 UTC m=+0.040249903 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true)
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: ++ cat /run_command
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: + CMD=neutron-ovn-metadata-agent
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: + ARGS=
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: + sudo kolla_copy_cacerts
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: + [[ ! -n '' ]]
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: + . kolla_extend_start
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: Running command: 'neutron-ovn-metadata-agent'
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: + umask 0022
Jan 31 01:18:53 np0005603500 ovn_metadata_agent[104639]: + exec neutron-ovn-metadata-agent
Jan 31 01:18:53 np0005603500 systemd[1]: Reloading.
Jan 31 01:18:53 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:18:53 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:18:53 np0005603500 systemd[1]: Started ovn_metadata_agent container.
Jan 31 01:18:54 np0005603500 python3.9[104878]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.787 104644 INFO neutron.common.config [-] Logging enabled!
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.787 104644 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev143
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.787 104644 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:124
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.788 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.788 104644 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.788 104644 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.789 104644 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.789 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.789 104644 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.789 104644 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.789 104644 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.789 104644 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.789 104644 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.789 104644 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.789 104644 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.790 104644 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.790 104644 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.790 104644 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.790 104644 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.790 104644 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.790 104644 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.790 104644 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.790 104644 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.790 104644 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.791 104644 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.791 104644 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.791 104644 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.791 104644 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.791 104644 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.791 104644 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.791 104644 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.791 104644 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.792 104644 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.792 104644 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.792 104644 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.792 104644 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.792 104644 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.792 104644 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.793 104644 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.793 104644 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.793 104644 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.793 104644 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.793 104644 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.793 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.793 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.793 104644 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.794 104644 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.795 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.796 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.796 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.796 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.796 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.796 104644 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.796 104644 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.796 104644 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.796 104644 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.796 104644 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.797 104644 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.798 104644 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.798 104644 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.798 104644 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.798 104644 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.798 104644 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.798 104644 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.798 104644 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.798 104644 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.798 104644 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.799 104644 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.800 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.801 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.801 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.801 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.801 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.801 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.801 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.801 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.801 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.801 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.802 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.802 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.802 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.802 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.802 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.802 104644 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.802 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.802 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.802 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.803 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.803 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.803 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.803 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.803 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.803 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.803 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.803 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.803 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.804 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.805 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.805 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.805 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.805 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.805 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.805 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.805 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.805 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.805 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.806 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.807 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.808 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.808 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.808 104644 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.808 104644 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.808 104644 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.808 104644 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.808 104644 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.808 104644 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.808 104644 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.809 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.810 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.810 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.810 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.810 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.810 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.810 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.810 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.810 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.810 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.811 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.811 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.811 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.811 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.811 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.811 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.811 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.811 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.811 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.812 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.812 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.812 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.812 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.812 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.812 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.812 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.812 104644 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.812 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.813 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.814 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.814 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.814 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.814 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.814 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.814 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.814 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.814 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.814 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.815 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.816 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.816 104644 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.816 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.816 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.816 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.816 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.816 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.816 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.816 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.817 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.818 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.819 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.819 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.819 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.819 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.819 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.819 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.819 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.819 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.819 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.820 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.820 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.820 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.820 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.820 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.820 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.820 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.820 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.820 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.821 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.821 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.821 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.821 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.821 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.821 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.821 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.821 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.821 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.822 104644 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.823 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.823 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.823 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.823 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.823 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.823 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.823 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.823 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.823 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.824 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.825 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.825 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.825 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.825 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.825 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.825 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.825 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.825 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.825 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.826 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.827 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.827 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.827 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.827 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.827 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.827 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.827 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.827 104644 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.827 104644 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.836 104644 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.837 104644 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.837 104644 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.837 104644 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.837 104644 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.849 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name fe203bcd-9b71-4c38-9736-f063b4ce4137 (UUID: fe203bcd-9b71-4c38-9736-f063b4ce4137) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:419
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.885 104644 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.885 104644 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.885 104644 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.885 104644 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.885 104644 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.889 104644 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.894 104644 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.900 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'fe203bcd-9b71-4c38-9736-f063b4ce4137'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], external_ids={}, name=fe203bcd-9b71-4c38-9736-f063b4ce4137, nb_cfg_timestamp=1769840291678, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:18:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:54.903 104644 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpde2wlhk9/privsep.sock']
Jan 31 01:18:54 np0005603500 python3.9[105032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:18:55 np0005603500 python3.9[105167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840334.5022805-484-149512405958143/.source.yaml _original_basename=.xxq39mc4 follow=False checksum=006e267b0e2a9b927178f6eb8efc37ebb0d06516 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:18:55 np0005603500 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 31 01:18:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:55.554 104644 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 31 01:18:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:55.555 104644 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpde2wlhk9/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:366
Jan 31 01:18:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:55.442 105168 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 31 01:18:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:55.445 105168 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 31 01:18:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:55.446 105168 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 31 01:18:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:55.446 105168 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105168
Jan 31 01:18:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:55.556 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5f5e17-e15b-4123-92fb-85c7aae16d34]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:18:55 np0005603500 systemd[1]: session-21.scope: Deactivated successfully.
Jan 31 01:18:55 np0005603500 systemd[1]: session-21.scope: Consumed 30.374s CPU time.
Jan 31 01:18:55 np0005603500 systemd-logind[821]: Session 21 logged out. Waiting for processes to exit.
Jan 31 01:18:55 np0005603500 systemd-logind[821]: Removed session 21.
Jan 31 01:18:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:56.011 105168 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:18:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:56.011 105168 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:18:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:56.011 105168 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:18:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:56.393 105168 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 31 01:18:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:56.398 105168 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 31 01:18:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:56.432 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[31e7f315-123c-40ae-9efc-eee1c2dcf8e8]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:18:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:56.433 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, column=external_ids, values=({'neutron:ovn-metadata-id': '9c4b81e3-f69e-5c36-a593-dfb18db72130'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:18:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:18:56.501 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:19:00 np0005603500 systemd-logind[821]: New session 22 of user zuul.
Jan 31 01:19:00 np0005603500 systemd[1]: Started Session 22 of User zuul.
Jan 31 01:19:01 np0005603500 python3.9[105352]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:19:02 np0005603500 podman[105480]: 2026-01-31 06:19:02.871011453 +0000 UTC m=+0.116933428 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:19:02 np0005603500 python3.9[105525]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:19:04 np0005603500 python3.9[105699]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:19:04 np0005603500 systemd[1]: Reloading.
Jan 31 01:19:04 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:19:04 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:19:05 np0005603500 python3.9[105884]: ansible-ansible.builtin.service_facts Invoked
Jan 31 01:19:05 np0005603500 network[105901]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 01:19:05 np0005603500 network[105902]: 'network-scripts' will be removed from distribution in near future.
Jan 31 01:19:05 np0005603500 network[105903]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 01:19:07 np0005603500 python3.9[106164]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:19:08 np0005603500 python3.9[106317]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:19:09 np0005603500 python3.9[106470]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:19:09 np0005603500 python3.9[106623]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:19:10 np0005603500 python3.9[106776]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:19:11 np0005603500 python3.9[106929]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:19:11 np0005603500 python3.9[107082]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:19:12 np0005603500 python3.9[107235]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:13 np0005603500 python3.9[107387]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:13 np0005603500 python3.9[107539]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:14 np0005603500 python3.9[107691]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:14 np0005603500 python3.9[107843]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:15 np0005603500 python3.9[107995]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:15 np0005603500 python3.9[108147]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:16 np0005603500 python3.9[108299]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:16 np0005603500 python3.9[108451]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:17 np0005603500 python3.9[108603]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:17 np0005603500 python3.9[108755]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:18 np0005603500 python3.9[108907]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:19 np0005603500 python3.9[109059]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:19 np0005603500 python3.9[109211]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:19:20 np0005603500 python3.9[109363]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:19:20 np0005603500 python3.9[109515]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 01:19:21 np0005603500 python3.9[109667]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:19:21 np0005603500 systemd[1]: Reloading.
Jan 31 01:19:21 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:19:21 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:19:22 np0005603500 python3.9[109854]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:19:23 np0005603500 python3.9[110007]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:19:23 np0005603500 podman[110132]: 2026-01-31 06:19:23.388387551 +0000 UTC m=+0.054463236 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 01:19:23 np0005603500 python3.9[110179]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:19:24 np0005603500 python3.9[110332]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:19:24 np0005603500 python3.9[110485]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:19:25 np0005603500 python3.9[110638]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:19:25 np0005603500 python3.9[110791]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:19:26 np0005603500 python3.9[110944]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 31 01:19:27 np0005603500 python3.9[111097]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 01:19:28 np0005603500 python3.9[111255]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 01:19:29 np0005603500 python3.9[111415]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:19:30 np0005603500 python3.9[111499]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:19:33 np0005603500 podman[111509]: 2026-01-31 06:19:33.174863542 +0000 UTC m=+0.087360796 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Jan 31 01:19:54 np0005603500 podman[111709]: 2026-01-31 06:19:54.139459654 +0000 UTC m=+0.058068358 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:19:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:19:54.889 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:19:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:19:54.890 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:19:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:19:54.890 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:20:04 np0005603500 podman[111728]: 2026-01-31 06:20:04.16435216 +0000 UTC m=+0.081341297 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 01:20:25 np0005603500 kernel: SELinux:  Converting 2766 SID table entries...
Jan 31 01:20:25 np0005603500 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:20:25 np0005603500 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:20:25 np0005603500 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:20:25 np0005603500 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:20:25 np0005603500 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:20:25 np0005603500 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:20:25 np0005603500 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:20:25 np0005603500 podman[111766]: 2026-01-31 06:20:25.145584019 +0000 UTC m=+0.067271010 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 01:20:25 np0005603500 dbus-broker-launch[801]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 31 01:20:34 np0005603500 kernel: SELinux:  Converting 2766 SID table entries...
Jan 31 01:20:34 np0005603500 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:20:34 np0005603500 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:20:34 np0005603500 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:20:34 np0005603500 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:20:34 np0005603500 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:20:34 np0005603500 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:20:34 np0005603500 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:20:35 np0005603500 dbus-broker-launch[801]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 31 01:20:35 np0005603500 podman[111794]: 2026-01-31 06:20:35.156510329 +0000 UTC m=+0.072250614 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:20:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:20:54.950 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:20:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:20:54.950 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:20:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:20:54.950 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:20:56 np0005603500 podman[121916]: 2026-01-31 06:20:56.123615164 +0000 UTC m=+0.043962052 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:21:06 np0005603500 podman[128711]: 2026-01-31 06:21:06.139827098 +0000 UTC m=+0.059501766 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:21:17 np0005603500 kernel: SELinux:  Converting 2767 SID table entries...
Jan 31 01:21:17 np0005603500 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:21:17 np0005603500 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:21:17 np0005603500 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:21:17 np0005603500 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:21:17 np0005603500 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:21:17 np0005603500 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:21:17 np0005603500 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:21:19 np0005603500 dbus-broker-launch[789]: Noticed file-system modification, trigger reload.
Jan 31 01:21:19 np0005603500 dbus-broker-launch[801]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 31 01:21:19 np0005603500 dbus-broker-launch[789]: Noticed file-system modification, trigger reload.
Jan 31 01:21:27 np0005603500 systemd[1]: Stopping OpenSSH server daemon...
Jan 31 01:21:27 np0005603500 systemd[1]: sshd.service: Deactivated successfully.
Jan 31 01:21:27 np0005603500 systemd[1]: Stopped OpenSSH server daemon.
Jan 31 01:21:27 np0005603500 systemd[1]: sshd.service: Consumed 1.165s CPU time, read 32.0K from disk, written 0B to disk.
Jan 31 01:21:27 np0005603500 systemd[1]: Stopped target sshd-keygen.target.
Jan 31 01:21:27 np0005603500 systemd[1]: Stopping sshd-keygen.target...
Jan 31 01:21:27 np0005603500 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 01:21:27 np0005603500 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 01:21:27 np0005603500 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 01:21:27 np0005603500 systemd[1]: Reached target sshd-keygen.target.
Jan 31 01:21:27 np0005603500 systemd[1]: Starting OpenSSH server daemon...
Jan 31 01:21:27 np0005603500 systemd[1]: Started OpenSSH server daemon.
Jan 31 01:21:27 np0005603500 podman[129501]: 2026-01-31 06:21:27.054513841 +0000 UTC m=+0.065349015 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:21:29 np0005603500 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:21:29 np0005603500 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:21:29 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:29 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:29 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:29 np0005603500 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:21:37 np0005603500 podman[132008]: 2026-01-31 06:21:37.17750453 +0000 UTC m=+0.093574871 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 01:21:37 np0005603500 python3.9[132625]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 01:21:37 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:37 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:37 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:38 np0005603500 python3.9[134105]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 01:21:38 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:38 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:38 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:39 np0005603500 python3.9[135668]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 01:21:39 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:39 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:39 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:40 np0005603500 python3.9[137288]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 01:21:40 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:40 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:40 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:41 np0005603500 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:21:41 np0005603500 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:21:41 np0005603500 systemd[1]: man-db-cache-update.service: Consumed 7.454s CPU time.
Jan 31 01:21:41 np0005603500 systemd[1]: run-r2b3dda8a26eb4a45bf87a36ad60d3bd7.service: Deactivated successfully.
Jan 31 01:21:41 np0005603500 python3.9[138977]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:41 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:41 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:41 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:42 np0005603500 python3.9[139287]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:42 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:42 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:42 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:43 np0005603500 python3.9[139477]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:43 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:43 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:43 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:44 np0005603500 python3.9[139668]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:46 np0005603500 python3.9[139823]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:47 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:47 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:47 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:48 np0005603500 python3.9[140013]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 01:21:48 np0005603500 systemd[1]: Reloading.
Jan 31 01:21:48 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:21:48 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:21:48 np0005603500 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 31 01:21:48 np0005603500 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 31 01:21:49 np0005603500 python3.9[140206]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:50 np0005603500 python3.9[140361]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:50 np0005603500 python3.9[140516]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:51 np0005603500 python3.9[140671]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:52 np0005603500 python3.9[140826]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:52 np0005603500 python3.9[140981]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:53 np0005603500 python3.9[141136]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:54 np0005603500 python3.9[141291]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:21:55.012 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:21:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:21:55.014 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:21:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:21:55.014 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:21:55 np0005603500 python3.9[141446]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:55 np0005603500 python3.9[141602]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:56 np0005603500 python3.9[141757]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:57 np0005603500 podman[141884]: 2026-01-31 06:21:57.173342531 +0000 UTC m=+0.079924358 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 31 01:21:57 np0005603500 python3.9[141927]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:58 np0005603500 python3.9[142086]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:21:59 np0005603500 python3.9[142241]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 01:22:00 np0005603500 python3.9[142396]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:22:00 np0005603500 python3.9[142548]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:22:01 np0005603500 python3.9[142700]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:22:01 np0005603500 python3.9[142852]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:22:02 np0005603500 python3.9[143004]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:22:03 np0005603500 python3.9[143156]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:22:03 np0005603500 python3.9[143306]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:22:04 np0005603500 python3.9[143458]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:05 np0005603500 python3.9[143583]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769840523.9784524-557-276965848778353/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:06 np0005603500 python3.9[143735]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:06 np0005603500 python3.9[143860]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769840525.5397274-557-4407145797940/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:07 np0005603500 python3.9[144012]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:07 np0005603500 podman[144109]: 2026-01-31 06:22:07.368755403 +0000 UTC m=+0.081327023 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:22:07 np0005603500 python3.9[144154]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769840526.6290946-557-4559535171193/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:08 np0005603500 python3.9[144315]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:08 np0005603500 python3.9[144440]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769840527.675036-557-43581835693223/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:09 np0005603500 python3.9[144592]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:09 np0005603500 python3.9[144717]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769840528.7193248-557-66948284037168/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:10 np0005603500 python3.9[144869]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:10 np0005603500 python3.9[144994]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769840529.8078427-557-62094636326507/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:11 np0005603500 python3.9[145146]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:11 np0005603500 python3.9[145269]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769840530.8636556-557-111712976287072/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:12 np0005603500 python3.9[145421]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:12 np0005603500 python3.9[145546]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769840532.0427685-557-13636581435497/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:13 np0005603500 python3.9[145698]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 31 01:22:14 np0005603500 python3.9[145851]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:14 np0005603500 python3.9[146003]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:15 np0005603500 python3.9[146155]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:16 np0005603500 python3.9[146307]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:16 np0005603500 python3.9[146459]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:17 np0005603500 python3.9[146611]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:17 np0005603500 python3.9[146763]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:18 np0005603500 python3.9[146915]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:18 np0005603500 python3.9[147067]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:19 np0005603500 python3.9[147219]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:20 np0005603500 python3.9[147371]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:20 np0005603500 python3.9[147523]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:21 np0005603500 python3.9[147675]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:21 np0005603500 python3.9[147827]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:22 np0005603500 python3.9[147979]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:22 np0005603500 python3.9[148102]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840541.7838216-778-45872091116384/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:23 np0005603500 python3.9[148254]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:23 np0005603500 python3.9[148377]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840542.795405-778-132219844292886/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:24 np0005603500 python3.9[148529]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:24 np0005603500 python3.9[148652]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840543.8834746-778-34584261005794/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:25 np0005603500 python3.9[148804]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:25 np0005603500 python3.9[148927]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840544.84072-778-247678858494166/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:26 np0005603500 python3.9[149079]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:26 np0005603500 python3.9[149202]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840545.8963113-778-157237387250416/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:27 np0005603500 python3.9[149354]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:27 np0005603500 podman[149449]: 2026-01-31 06:22:27.559362101 +0000 UTC m=+0.072047532 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 01:22:27 np0005603500 python3.9[149490]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840546.8698277-778-156163407171978/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:28 np0005603500 python3.9[149646]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:28 np0005603500 python3.9[149769]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840547.815541-778-217598027156416/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:29 np0005603500 python3.9[149921]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:29 np0005603500 python3.9[150044]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840548.809535-778-62178525682227/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:30 np0005603500 python3.9[150196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:30 np0005603500 python3.9[150319]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840549.7929218-778-152780600875297/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:31 np0005603500 python3.9[150471]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:31 np0005603500 python3.9[150594]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840550.7349143-778-98206883311039/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:32 np0005603500 python3.9[150746]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:32 np0005603500 python3.9[150869]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840551.7447567-778-26896801994136/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:33 np0005603500 python3.9[151021]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:33 np0005603500 python3.9[151144]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840552.8798866-778-64996557576441/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:34 np0005603500 python3.9[151296]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:34 np0005603500 python3.9[151419]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840553.8923805-778-130078959333343/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:35 np0005603500 python3.9[151571]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:35 np0005603500 python3.9[151694]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840554.8912969-778-227702197639209/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:36 np0005603500 python3.9[151844]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:22:37 np0005603500 python3.9[151999]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 31 01:22:38 np0005603500 dbus-broker-launch[801]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 31 01:22:38 np0005603500 podman[152001]: 2026-01-31 06:22:38.184913579 +0000 UTC m=+0.093781424 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 31 01:22:39 np0005603500 python3.9[152182]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:39 np0005603500 python3.9[152334]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:40 np0005603500 python3.9[152486]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:40 np0005603500 python3.9[152638]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:41 np0005603500 python3.9[152790]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:41 np0005603500 python3.9[152942]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:42 np0005603500 python3.9[153094]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:42 np0005603500 python3.9[153246]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:43 np0005603500 python3.9[153398]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:43 np0005603500 python3.9[153550]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:44 np0005603500 python3.9[153702]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:22:44 np0005603500 systemd[1]: Reloading.
Jan 31 01:22:44 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:22:44 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:22:44 np0005603500 systemd[1]: Starting libvirt logging daemon socket...
Jan 31 01:22:44 np0005603500 systemd[1]: Listening on libvirt logging daemon socket.
Jan 31 01:22:44 np0005603500 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 31 01:22:44 np0005603500 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 31 01:22:44 np0005603500 systemd[1]: Starting libvirt logging daemon...
Jan 31 01:22:44 np0005603500 systemd[1]: Started libvirt logging daemon.
Jan 31 01:22:45 np0005603500 python3.9[153896]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:22:45 np0005603500 systemd[1]: Reloading.
Jan 31 01:22:45 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:22:45 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:22:45 np0005603500 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 31 01:22:45 np0005603500 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 31 01:22:45 np0005603500 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 31 01:22:45 np0005603500 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 31 01:22:45 np0005603500 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 31 01:22:45 np0005603500 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 31 01:22:45 np0005603500 systemd[1]: Starting libvirt nodedev daemon...
Jan 31 01:22:45 np0005603500 systemd[1]: Started libvirt nodedev daemon.
Jan 31 01:22:46 np0005603500 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 31 01:22:46 np0005603500 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 31 01:22:46 np0005603500 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 31 01:22:46 np0005603500 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 31 01:22:46 np0005603500 python3.9[154113]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:22:46 np0005603500 systemd[1]: Reloading.
Jan 31 01:22:46 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:22:46 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:22:46 np0005603500 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 31 01:22:46 np0005603500 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 31 01:22:46 np0005603500 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 31 01:22:46 np0005603500 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 31 01:22:46 np0005603500 systemd[1]: Starting libvirt proxy daemon...
Jan 31 01:22:46 np0005603500 systemd[1]: Started libvirt proxy daemon.
Jan 31 01:22:47 np0005603500 setroubleshoot[153959]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 6e256026-675b-45a7-9bdc-bd191d845889
Jan 31 01:22:47 np0005603500 setroubleshoot[153959]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 31 01:22:47 np0005603500 setroubleshoot[153959]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 6e256026-675b-45a7-9bdc-bd191d845889
Jan 31 01:22:47 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:22:47 np0005603500 setroubleshoot[153959]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 31 01:22:47 np0005603500 python3.9[154333]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:22:47 np0005603500 systemd[1]: Reloading.
Jan 31 01:22:47 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:22:47 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:22:47 np0005603500 systemd[1]: Listening on libvirt locking daemon socket.
Jan 31 01:22:47 np0005603500 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 31 01:22:47 np0005603500 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 31 01:22:47 np0005603500 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 31 01:22:47 np0005603500 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 31 01:22:47 np0005603500 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 31 01:22:47 np0005603500 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 31 01:22:47 np0005603500 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 31 01:22:47 np0005603500 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 31 01:22:47 np0005603500 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 31 01:22:47 np0005603500 systemd[1]: Starting libvirt QEMU daemon...
Jan 31 01:22:47 np0005603500 systemd[1]: Started libvirt QEMU daemon.
Jan 31 01:22:48 np0005603500 python3.9[154550]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:22:48 np0005603500 systemd[1]: Reloading.
Jan 31 01:22:48 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:22:48 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:22:48 np0005603500 systemd[1]: Starting libvirt secret daemon socket...
Jan 31 01:22:48 np0005603500 systemd[1]: Listening on libvirt secret daemon socket.
Jan 31 01:22:48 np0005603500 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 31 01:22:48 np0005603500 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 31 01:22:48 np0005603500 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 31 01:22:48 np0005603500 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 31 01:22:48 np0005603500 systemd[1]: Starting libvirt secret daemon...
Jan 31 01:22:48 np0005603500 systemd[1]: Started libvirt secret daemon.
Jan 31 01:22:49 np0005603500 python3.9[154761]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:50 np0005603500 python3.9[154913]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 01:22:50 np0005603500 python3.9[155065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:51 np0005603500 python3.9[155188]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840570.440835-1123-277569220964962/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:52 np0005603500 python3.9[155340]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:52 np0005603500 python3.9[155492]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:53 np0005603500 python3.9[155570]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:53 np0005603500 python3.9[155722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:54 np0005603500 python3.9[155800]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.1t93psol recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:54 np0005603500 python3.9[155952]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:22:55.076 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:22:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:22:55.078 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:22:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:22:55.078 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:22:55 np0005603500 python3.9[156030]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:55 np0005603500 python3.9[156183]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
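The `nft -j list ruleset` command run above emits the ruleset as JSON: libnftables wraps everything in a top-level `nftables` array whose elements are single-key objects (`metainfo`, `table`, `chain`, `rule`, ...). A hedged sketch of walking that structure; the sample document below is synthetic, not captured from this host:

```python
import json

# Synthetic example shaped like `nft -j list ruleset` output (not from this host).
sample = json.dumps({
    "nftables": [
        {"metainfo": {"version": "1.0.4", "json_schema_version": 1}},
        {"table": {"family": "inet", "name": "filter", "handle": 1}},
        {"chain": {"family": "inet", "table": "filter",
                   "name": "EDPM_INPUT", "handle": 2}},
    ]
})

def chain_names(ruleset_json: str) -> list:
    """Collect chain names from libnftables JSON output."""
    doc = json.loads(ruleset_json)
    return [o["chain"]["name"] for o in doc.get("nftables", []) if "chain" in o]
```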
Jan 31 01:22:56 np0005603500 python3[156336]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 01:22:57 np0005603500 python3.9[156488]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:57 np0005603500 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 31 01:22:57 np0005603500 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.009s CPU time.
Jan 31 01:22:57 np0005603500 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 31 01:22:57 np0005603500 python3.9[156566]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:57 np0005603500 podman[156690]: 2026-01-31 06:22:57.973403266 +0000 UTC m=+0.053547661 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true)
Jan 31 01:22:58 np0005603500 python3.9[156734]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:58 np0005603500 python3.9[156861]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840577.7074575-1212-92819058259428/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:22:59 np0005603500 python3.9[157013]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:22:59 np0005603500 python3.9[157091]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:00 np0005603500 python3.9[157243]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:23:00 np0005603500 python3.9[157321]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:01 np0005603500 python3.9[157473]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:23:01 np0005603500 python3.9[157598]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840580.7474563-1251-29452142645627/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:02 np0005603500 python3.9[157750]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:02 np0005603500 python3.9[157902]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:23:03 np0005603500 python3.9[158057]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
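The `blockinfile` task above maintains the four `include` lines between `# BEGIN ANSIBLE MANAGED BLOCK` / `# END ANSIBLE MANAGED BLOCK` markers in `/etc/sysconfig/nftables.conf`, validating the result with `nft -c -f %s`. A simplified sketch of that marker-based upsert (no validation step, and Ansible's real module supports `insertafter`/`insertbefore` placement we omit here):

```python
BEGIN = "# BEGIN ANSIBLE MANAGED BLOCK"
END = "# END ANSIBLE MANAGED BLOCK"

def upsert_block(text: str, block: str) -> str:
    """Keep exactly one marker-delimited copy of `block` in `text`."""
    lines = text.splitlines()
    new_block = [BEGIN, *block.splitlines(), END]
    if BEGIN in lines and END in lines:
        i, j = lines.index(BEGIN), lines.index(END)
        lines[i:j + 1] = new_block          # replace the existing block in place
    else:
        lines.extend(new_block)             # append at EOF (the module's default)
    return "\n".join(lines) + "\n"
```

Running it twice with the same block is a no-op, which is what makes the play idempotent.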
Jan 31 01:23:04 np0005603500 python3.9[158209]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:23:04 np0005603500 python3.9[158362]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:23:05 np0005603500 python3.9[158516]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
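The two `nft` pipelines above split validation from application: at 01:23:02 the full five-file set is only syntax-checked (`nft -c -f -`), while at 01:23:05 just the flush/rules/update-jumps files are streamed into `nft -f -` so the rule refresh lands as one transaction (`edpm-chains.nft` was loaded separately at 01:23:04, and `edpm-jumps.nft` is pulled in via `/etc/sysconfig/nftables.conf`). A sketch of only that ordering, with file names taken from the log:

```python
from pathlib import PurePosixPath

NFT_DIR = PurePosixPath("/etc/nftables")

# Order matters: chains before flushes, flushes before rules, jumps last.
CHECK_FILES = ["edpm-chains.nft", "edpm-flushes.nft", "edpm-rules.nft",
               "edpm-update-jumps.nft", "edpm-jumps.nft"]
APPLY_FILES = CHECK_FILES[1:4]  # flushes, rules, update-jumps (the 01:23:05 run)

def cat_command(names) -> str:
    """Rebuild the `cat ... | nft -f -` pipeline the play executes."""
    paths = " ".join(str(NFT_DIR / n) for n in names)
    return f"set -o pipefail; cat {paths} | nft -f -"
```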
Jan 31 01:23:06 np0005603500 python3.9[158671]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:06 np0005603500 python3.9[158823]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:23:07 np0005603500 python3.9[158946]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840586.203016-1323-195419634739090/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:07 np0005603500 python3.9[159098]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:23:08 np0005603500 python3.9[159221]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840587.1842384-1338-248159643962110/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:08 np0005603500 podman[159345]: 2026-01-31 06:23:08.534546926 +0000 UTC m=+0.097597784 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller)
Jan 31 01:23:08 np0005603500 python3.9[159394]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:23:09 np0005603500 python3.9[159522]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840588.2270305-1353-65623503428759/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:09 np0005603500 python3.9[159674]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:23:09 np0005603500 systemd[1]: Reloading.
Jan 31 01:23:09 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:23:09 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:23:10 np0005603500 systemd[1]: Reached target edpm_libvirt.target.
Jan 31 01:23:10 np0005603500 python3.9[159865]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 31 01:23:10 np0005603500 systemd[1]: Reloading.
Jan 31 01:23:10 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:23:10 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:23:11 np0005603500 systemd[1]: Reloading.
Jan 31 01:23:11 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:23:11 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:23:11 np0005603500 systemd[1]: session-22.scope: Deactivated successfully.
Jan 31 01:23:11 np0005603500 systemd[1]: session-22.scope: Consumed 2min 49.002s CPU time.
Jan 31 01:23:11 np0005603500 systemd-logind[821]: Session 22 logged out. Waiting for processes to exit.
Jan 31 01:23:11 np0005603500 systemd-logind[821]: Removed session 22.
Jan 31 01:23:17 np0005603500 systemd-logind[821]: New session 23 of user zuul.
Jan 31 01:23:17 np0005603500 systemd[1]: Started Session 23 of User zuul.
Jan 31 01:23:18 np0005603500 python3.9[160115]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:23:19 np0005603500 python3.9[160269]: ansible-ansible.builtin.service_facts Invoked
Jan 31 01:23:19 np0005603500 network[160286]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 01:23:19 np0005603500 network[160287]: 'network-scripts' will be removed from distribution in near future.
Jan 31 01:23:19 np0005603500 network[160288]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 01:23:22 np0005603500 python3.9[160559]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 01:23:23 np0005603500 python3.9[160643]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:23:28 np0005603500 podman[160645]: 2026-01-31 06:23:28.129320481 +0000 UTC m=+0.045939330 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 01:23:29 np0005603500 python3.9[160816]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:23:30 np0005603500 python3.9[160968]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:23:31 np0005603500 python3.9[161121]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:23:31 np0005603500 python3.9[161273]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
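`/usr/sbin/iscsi-iname`, invoked above, generates a random initiator IQN that the next task writes to `/etc/iscsi/initiatorname.iscsi`. A hedged imitation of the output shape only; the real tool uses its own generator and its suffix format may differ:

```python
import re
import secrets

def make_iqn(prefix: str = "iqn.1994-05.com.redhat") -> str:
    """Mimic the "<prefix>:<random hex>" shape of an iscsi-iname IQN."""
    return f"{prefix}:{secrets.token_hex(6)}"

# Loose shape check: iqn.YYYY-MM.<reversed-domain>:<hex suffix>
IQN_RE = re.compile(r"^iqn\.\d{4}-\d{2}(\.[a-z0-9-]+)+:[0-9a-f]+$")
```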
Jan 31 01:23:32 np0005603500 python3.9[161426]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:23:32 np0005603500 python3.9[161549]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840611.9273398-90-60574763468654/.source.iscsi _original_basename=.oy4njllu follow=False checksum=2f60ce6c1ab91c6f7007004e4b60aa9acf59cff3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:33 np0005603500 python3.9[161701]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:34 np0005603500 python3.9[161853]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
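The `lineinfile` task above enables the extended CHAP digest list: it replaces any line matching `regexp=^node.session.auth.chap_algs`, otherwise inserts the new line after the commented template line matched by `insertafter`. A simplified sketch of those semantics (Ansible's real module additionally handles `backrefs`, `firstmatch`, file creation, etc.):

```python
import re

def line_in_file(lines, regexp, insertafter, line):
    """Replace the first line matching `regexp`; otherwise insert `line`
    after the last line matching `insertafter` (append at EOF if neither)."""
    rx, ia = re.compile(regexp), re.compile(insertafter)
    for i, l in enumerate(lines):
        if rx.search(l):
            lines[i] = line                 # in-place replacement: idempotent
            return lines
    anchors = [i for i, l in enumerate(lines) if ia.search(l)]
    if anchors:
        lines.insert(anchors[-1] + 1, line)
    else:
        lines.append(line)
    return lines
```

With the parameters logged above, a stock `iscsid.conf` (where the option exists only as a comment) gains the new line directly below the comment, and re-running leaves the file unchanged.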
Jan 31 01:23:35 np0005603500 python3.9[162005]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:23:35 np0005603500 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 31 01:23:36 np0005603500 python3.9[162161]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:23:36 np0005603500 systemd[1]: Reloading.
Jan 31 01:23:36 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:23:36 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:23:36 np0005603500 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 31 01:23:36 np0005603500 systemd[1]: Starting Open-iSCSI...
Jan 31 01:23:36 np0005603500 kernel: Loading iSCSI transport class v2.0-870.
Jan 31 01:23:36 np0005603500 systemd[1]: Started Open-iSCSI.
Jan 31 01:23:36 np0005603500 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 31 01:23:36 np0005603500 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 31 01:23:37 np0005603500 python3.9[162360]: ansible-ansible.builtin.service_facts Invoked
Jan 31 01:23:37 np0005603500 network[162377]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 01:23:37 np0005603500 network[162378]: 'network-scripts' will be removed from distribution in near future.
Jan 31 01:23:37 np0005603500 network[162379]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 01:23:38 np0005603500 podman[162405]: 2026-01-31 06:23:38.663456096 +0000 UTC m=+0.081382680 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller)
Jan 31 01:23:40 np0005603500 python3.9[162677]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:23:42 np0005603500 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:23:42 np0005603500 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:23:42 np0005603500 systemd[1]: Reloading.
Jan 31 01:23:42 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:23:42 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:23:43 np0005603500 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:23:43 np0005603500 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:23:43 np0005603500 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:23:43 np0005603500 systemd[1]: run-r31b36a7c02124f27bef5c5d3c64c20d7.service: Deactivated successfully.
Jan 31 01:23:44 np0005603500 python3.9[162994]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 31 01:23:44 np0005603500 python3.9[163146]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 31 01:23:45 np0005603500 python3.9[163302]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:23:45 np0005603500 python3.9[163425]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840625.0056925-178-172639169684660/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:46 np0005603500 python3.9[163577]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:47 np0005603500 python3.9[163729]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:23:47 np0005603500 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 31 01:23:47 np0005603500 systemd[1]: Stopped Load Kernel Modules.
Jan 31 01:23:47 np0005603500 systemd[1]: Stopping Load Kernel Modules...
Jan 31 01:23:47 np0005603500 systemd[1]: Starting Load Kernel Modules...
Jan 31 01:23:47 np0005603500 systemd[1]: Finished Load Kernel Modules.
Jan 31 01:23:48 np0005603500 python3.9[163885]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:23:48 np0005603500 python3.9[164038]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:23:49 np0005603500 python3.9[164190]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:23:49 np0005603500 python3.9[164313]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840629.094944-229-112366591803441/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:50 np0005603500 python3.9[164465]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:23:51 np0005603500 python3.9[164618]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:51 np0005603500 python3.9[164770]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:52 np0005603500 python3.9[164922]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:53 np0005603500 python3.9[165074]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:53 np0005603500 python3.9[165226]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:54 np0005603500 python3.9[165378]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:55 np0005603500 python3.9[165530]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:23:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:23:55.140 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:23:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:23:55.141 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:23:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:23:55.141 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:23:55 np0005603500 python3.9[165683]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:23:56 np0005603500 python3.9[165837]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:23:57 np0005603500 python3.9[165990]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:23:57 np0005603500 systemd[1]: Listening on multipathd control socket.
Jan 31 01:23:57 np0005603500 python3.9[166146]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:23:57 np0005603500 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 31 01:23:57 np0005603500 udevadm[166151]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 31 01:23:57 np0005603500 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 31 01:23:57 np0005603500 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 31 01:23:57 np0005603500 multipathd[166154]: --------start up--------
Jan 31 01:23:57 np0005603500 multipathd[166154]: read /etc/multipath.conf
Jan 31 01:23:57 np0005603500 multipathd[166154]: path checkers start up
Jan 31 01:23:57 np0005603500 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 31 01:23:58 np0005603500 podman[166285]: 2026-01-31 06:23:58.513503093 +0000 UTC m=+0.060497641 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 31 01:23:58 np0005603500 python3.9[166326]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 31 01:23:59 np0005603500 python3.9[166485]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 31 01:23:59 np0005603500 kernel: Key type psk registered
Jan 31 01:23:59 np0005603500 python3.9[166647]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:24:00 np0005603500 python3.9[166770]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840639.5252128-359-89111362923023/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:01 np0005603500 python3.9[166922]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:01 np0005603500 python3.9[167074]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:24:01 np0005603500 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 31 01:24:01 np0005603500 systemd[1]: Stopped Load Kernel Modules.
Jan 31 01:24:01 np0005603500 systemd[1]: Stopping Load Kernel Modules...
Jan 31 01:24:01 np0005603500 systemd[1]: Starting Load Kernel Modules...
Jan 31 01:24:01 np0005603500 systemd[1]: Finished Load Kernel Modules.
Jan 31 01:24:02 np0005603500 python3.9[167230]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 01:24:04 np0005603500 systemd[1]: Reloading.
Jan 31 01:24:04 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:24:04 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:24:04 np0005603500 systemd[1]: Reloading.
Jan 31 01:24:04 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:24:04 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:24:05 np0005603500 systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 31 01:24:05 np0005603500 systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 31 01:24:05 np0005603500 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:24:05 np0005603500 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:24:05 np0005603500 systemd[1]: Reloading.
Jan 31 01:24:05 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:24:05 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:24:05 np0005603500 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:24:07 np0005603500 python3.9[168696]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:24:07 np0005603500 systemd[1]: Stopping Open-iSCSI...
Jan 31 01:24:07 np0005603500 iscsid[162201]: iscsid shutting down.
Jan 31 01:24:07 np0005603500 systemd[1]: iscsid.service: Deactivated successfully.
Jan 31 01:24:07 np0005603500 systemd[1]: Stopped Open-iSCSI.
Jan 31 01:24:07 np0005603500 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 31 01:24:07 np0005603500 systemd[1]: Starting Open-iSCSI...
Jan 31 01:24:07 np0005603500 systemd[1]: Started Open-iSCSI.
Jan 31 01:24:08 np0005603500 python3.9[168853]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:24:08 np0005603500 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 31 01:24:08 np0005603500 multipathd[166154]: exit (signal)
Jan 31 01:24:08 np0005603500 multipathd[166154]: --------shut down-------
Jan 31 01:24:08 np0005603500 systemd[1]: multipathd.service: Deactivated successfully.
Jan 31 01:24:08 np0005603500 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 31 01:24:08 np0005603500 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 31 01:24:08 np0005603500 multipathd[168859]: --------start up--------
Jan 31 01:24:08 np0005603500 multipathd[168859]: read /etc/multipath.conf
Jan 31 01:24:08 np0005603500 multipathd[168859]: path checkers start up
Jan 31 01:24:08 np0005603500 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 31 01:24:09 np0005603500 podman[168990]: 2026-01-31 06:24:09.040308762 +0000 UTC m=+0.084488868 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:24:09 np0005603500 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:24:09 np0005603500 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:24:09 np0005603500 systemd[1]: man-db-cache-update.service: Consumed 1.455s CPU time.
Jan 31 01:24:09 np0005603500 systemd[1]: run-r12ce6eb1e48a453280d8355050eee19a.service: Deactivated successfully.
Jan 31 01:24:09 np0005603500 python3.9[169028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:24:10 np0005603500 python3.9[169201]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:11 np0005603500 python3.9[169353]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:24:11 np0005603500 systemd[1]: Reloading.
Jan 31 01:24:11 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:24:11 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:24:11 np0005603500 python3.9[169538]: ansible-ansible.builtin.service_facts Invoked
Jan 31 01:24:12 np0005603500 network[169555]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 01:24:12 np0005603500 network[169556]: 'network-scripts' will be removed from distribution in near future.
Jan 31 01:24:12 np0005603500 network[169557]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 01:24:15 np0005603500 python3.9[169829]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:24:16 np0005603500 python3.9[169982]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:24:17 np0005603500 python3.9[170135]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:24:17 np0005603500 python3.9[170288]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:24:18 np0005603500 python3.9[170441]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:24:19 np0005603500 python3.9[170594]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:24:20 np0005603500 python3.9[170747]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:24:20 np0005603500 python3.9[170900]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:24:21 np0005603500 python3.9[171053]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:22 np0005603500 python3.9[171205]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:22 np0005603500 python3.9[171357]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:23 np0005603500 python3.9[171509]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:23 np0005603500 python3.9[171661]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:24 np0005603500 python3.9[171813]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:24 np0005603500 python3.9[171965]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:25 np0005603500 python3.9[172117]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:26 np0005603500 python3.9[172269]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:26 np0005603500 python3.9[172421]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:27 np0005603500 python3.9[172573]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:27 np0005603500 python3.9[172725]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:28 np0005603500 python3.9[172877]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:28 np0005603500 podman[173001]: 2026-01-31 06:24:28.954475692 +0000 UTC m=+0.092708157 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 01:24:29 np0005603500 python3.9[173046]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:29 np0005603500 python3.9[173200]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:30 np0005603500 python3.9[173352]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:30 np0005603500 python3.9[173504]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:24:31 np0005603500 python3.9[173656]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 01:24:32 np0005603500 python3.9[173808]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:24:32 np0005603500 systemd[1]: Reloading.
Jan 31 01:24:32 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:24:32 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:24:33 np0005603500 python3.9[173995]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:24:33 np0005603500 python3.9[174148]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:24:34 np0005603500 python3.9[174301]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:24:35 np0005603500 python3.9[174454]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:24:35 np0005603500 python3.9[174607]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:24:36 np0005603500 python3.9[174760]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:24:36 np0005603500 python3.9[174913]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:24:37 np0005603500 python3.9[175066]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:24:39 np0005603500 python3.9[175219]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:39 np0005603500 podman[175220]: 2026-01-31 06:24:39.21124836 +0000 UTC m=+0.123913072 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:24:39 np0005603500 python3.9[175398]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:40 np0005603500 python3.9[175550]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:40 np0005603500 python3.9[175702]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:41 np0005603500 python3.9[175854]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:42 np0005603500 python3.9[176006]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:42 np0005603500 python3.9[176158]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:43 np0005603500 python3.9[176310]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:43 np0005603500 python3.9[176462]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:44 np0005603500 python3.9[176614]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:45 np0005603500 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 31 01:24:46 np0005603500 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 31 01:24:47 np0005603500 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 31 01:24:48 np0005603500 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 31 01:24:49 np0005603500 python3.9[176770]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 31 01:24:49 np0005603500 python3.9[176923]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 01:24:50 np0005603500 python3.9[177081]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 01:24:51 np0005603500 systemd-logind[821]: New session 24 of user zuul.
Jan 31 01:24:51 np0005603500 systemd[1]: Started Session 24 of User zuul.
Jan 31 01:24:51 np0005603500 systemd[1]: session-24.scope: Deactivated successfully.
Jan 31 01:24:51 np0005603500 systemd-logind[821]: Session 24 logged out. Waiting for processes to exit.
Jan 31 01:24:51 np0005603500 systemd-logind[821]: Removed session 24.
Jan 31 01:24:52 np0005603500 python3.9[177267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:24:53 np0005603500 python3.9[177388]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840692.1632373-966-198808855733530/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:53 np0005603500 python3.9[177538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:24:54 np0005603500 python3.9[177614]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:54 np0005603500 python3.9[177764]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:24:55 np0005603500 python3.9[177885]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840694.20941-966-187272868716550/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:24:55.202 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:24:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:24:55.204 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:24:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:24:55.204 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:24:55 np0005603500 python3.9[178036]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:24:56 np0005603500 python3.9[178157]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840695.209428-966-152974146483070/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:56 np0005603500 python3.9[178307]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:24:57 np0005603500 python3.9[178428]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840696.192718-966-129061885010531/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:57 np0005603500 python3.9[178578]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:24:58 np0005603500 python3.9[178699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840697.1612194-966-20933420324279/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:24:58 np0005603500 python3.9[178851]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:59 np0005603500 podman[178975]: 2026-01-31 06:24:59.100422934 +0000 UTC m=+0.057106894 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 31 01:24:59 np0005603500 python3.9[179019]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:24:59 np0005603500 python3.9[179173]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:00 np0005603500 python3.9[179325]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:00 np0005603500 python3.9[179448]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769840699.9999-1073-144976352776465/.source _original_basename=.e805t01g follow=False checksum=8a26b2da820a58fd50ac1d9f22e31a18163ba77c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 31 01:25:01 np0005603500 python3.9[179600]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:02 np0005603500 python3.9[179752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:02 np0005603500 python3.9[179873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840701.8211586-1099-157063626929290/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=83b9c03a02ddfc823170411b431db5a6cc858b19 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:25:03 np0005603500 python3.9[180023]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:03 np0005603500 python3.9[180144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840702.8928804-1114-190657377079607/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=fdb8381541fc24bc28a007a7156c3bbda176d7ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:25:04 np0005603500 python3.9[180296]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 31 01:25:05 np0005603500 python3.9[180448]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 01:25:06 np0005603500 python3[180600]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 01:25:07 np0005603500 podman[180635]: 2026-01-31 06:25:07.028208599 +0000 UTC m=+0.053250802 container create becbe0ceaca7c4faf9c895830b012cfdc855f21860396cd715b4db758c95fbad (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=nova_compute_init, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible)
Jan 31 01:25:07 np0005603500 podman[180635]: 2026-01-31 06:25:06.997431428 +0000 UTC m=+0.022473701 image pull 9464506e605c3736f039205df9460679aa5b9e23fa6c2ca013e2f0c1365f627e quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:25:07 np0005603500 python3[180600]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 31 01:25:07 np0005603500 python3.9[180824]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:08 np0005603500 python3.9[180978]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 31 01:25:09 np0005603500 python3.9[181130]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 01:25:09 np0005603500 podman[181254]: 2026-01-31 06:25:09.943443849 +0000 UTC m=+0.086004086 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 31 01:25:10 np0005603500 python3[181301]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 01:25:10 np0005603500 podman[181345]: 2026-01-31 06:25:10.33793653 +0000 UTC m=+0.048794142 container create 6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 01:25:10 np0005603500 podman[181345]: 2026-01-31 06:25:10.31419471 +0000 UTC m=+0.025052342 image pull 9464506e605c3736f039205df9460679aa5b9e23fa6c2ca013e2f0c1365f627e quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:25:10 np0005603500 python3[181301]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0 kolla_start
Jan 31 01:25:11 np0005603500 python3.9[181535]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:11 np0005603500 python3.9[181689]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:12 np0005603500 python3.9[181840]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769840711.7823582-1210-162080129243200/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:12 np0005603500 python3.9[181916]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:25:12 np0005603500 systemd[1]: Reloading.
Jan 31 01:25:12 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:25:12 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:25:14 np0005603500 python3.9[182026]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:25:14 np0005603500 systemd[1]: Reloading.
Jan 31 01:25:14 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:25:14 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:25:14 np0005603500 systemd[1]: Starting nova_compute container...
Jan 31 01:25:14 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:25:14 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:14 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:14 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:14 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:14 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:14 np0005603500 podman[182065]: 2026-01-31 06:25:14.728261256 +0000 UTC m=+0.158438982 container init 6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=edpm, container_name=nova_compute)
Jan 31 01:25:14 np0005603500 podman[182065]: 2026-01-31 06:25:14.73409346 +0000 UTC m=+0.164271166 container start 6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true)
Jan 31 01:25:14 np0005603500 nova_compute[182080]: + sudo -E kolla_set_configs
Jan 31 01:25:14 np0005603500 podman[182065]: nova_compute
Jan 31 01:25:14 np0005603500 systemd[1]: Started nova_compute container.
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Validating config file
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Copying service configuration files
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Deleting /etc/ceph
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Creating directory /etc/ceph
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /etc/ceph
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Writing out command to execute
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 01:25:14 np0005603500 nova_compute[182080]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 01:25:14 np0005603500 nova_compute[182080]: ++ cat /run_command
Jan 31 01:25:14 np0005603500 nova_compute[182080]: + CMD=nova-compute
Jan 31 01:25:14 np0005603500 nova_compute[182080]: + ARGS=
Jan 31 01:25:14 np0005603500 nova_compute[182080]: + sudo kolla_copy_cacerts
Jan 31 01:25:14 np0005603500 nova_compute[182080]: + [[ ! -n '' ]]
Jan 31 01:25:14 np0005603500 nova_compute[182080]: + . kolla_extend_start
Jan 31 01:25:14 np0005603500 nova_compute[182080]: + echo 'Running command: '\''nova-compute'\'''
Jan 31 01:25:14 np0005603500 nova_compute[182080]: Running command: 'nova-compute'
Jan 31 01:25:14 np0005603500 nova_compute[182080]: + umask 0022
Jan 31 01:25:14 np0005603500 nova_compute[182080]: + exec nova-compute
Jan 31 01:25:15 np0005603500 python3.9[182241]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:16 np0005603500 python3.9[182391]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:16 np0005603500 python3.9[182541]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:16 np0005603500 nova_compute[182080]: 2026-01-31 06:25:16.970 182084 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 31 01:25:16 np0005603500 nova_compute[182080]: 2026-01-31 06:25:16.970 182084 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 31 01:25:16 np0005603500 nova_compute[182080]: 2026-01-31 06:25:16.971 182084 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 31 01:25:16 np0005603500 nova_compute[182080]: 2026-01-31 06:25:16.971 182084 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 31 01:25:17 np0005603500 nova_compute[182080]: 2026-01-31 06:25:17.020 182084 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:25:17 np0005603500 nova_compute[182080]: 2026-01-31 06:25:17.029 182084 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:25:17 np0005603500 nova_compute[182080]: 2026-01-31 06:25:17.030 182084 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:423
Jan 31 01:25:17 np0005603500 nova_compute[182080]: 2026-01-31 06:25:17.065 182084 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Jan 31 01:25:17 np0005603500 nova_compute[182080]: 2026-01-31 06:25:17.068 182084 WARNING oslo_config.cfg [None req-8bc98e68-11ab-4781-817b-7c4cad660cff - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Jan 31 01:25:17 np0005603500 python3.9[182697]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 31 01:25:17 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:25:17 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:25:18 np0005603500 nova_compute[182080]: 2026-01-31 06:25:18.256 182084 INFO nova.virt.driver [None req-8bc98e68-11ab-4781-817b-7c4cad660cff - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 31 01:25:18 np0005603500 nova_compute[182080]: 2026-01-31 06:25:18.371 182084 INFO nova.compute.provider_config [None req-8bc98e68-11ab-4781-817b-7c4cad660cff - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 31 01:25:18 np0005603500 python3.9[182873]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:25:18 np0005603500 systemd[1]: Stopping nova_compute container...
Jan 31 01:25:18 np0005603500 systemd[1]: libpod-6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164.scope: Deactivated successfully.
Jan 31 01:25:18 np0005603500 systemd[1]: libpod-6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164.scope: Consumed 2.513s CPU time.
Jan 31 01:25:18 np0005603500 podman[182877]: 2026-01-31 06:25:18.723858773 +0000 UTC m=+0.045600050 container died 6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:25:18 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164-userdata-shm.mount: Deactivated successfully.
Jan 31 01:25:18 np0005603500 systemd[1]: var-lib-containers-storage-overlay-8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db-merged.mount: Deactivated successfully.
Jan 31 01:25:18 np0005603500 podman[182877]: 2026-01-31 06:25:18.776929788 +0000 UTC m=+0.098671075 container cleanup 6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=nova_compute, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 01:25:18 np0005603500 podman[182877]: nova_compute
Jan 31 01:25:18 np0005603500 podman[182905]: nova_compute
Jan 31 01:25:18 np0005603500 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 31 01:25:18 np0005603500 systemd[1]: Stopped nova_compute container.
Jan 31 01:25:18 np0005603500 systemd[1]: Starting nova_compute container...
Jan 31 01:25:18 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:25:18 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:18 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:18 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:18 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:18 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cae84a593678604cdce7c6279f26905d91079595e254f4786b8c38913c649db/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:18 np0005603500 podman[182918]: 2026-01-31 06:25:18.959138869 +0000 UTC m=+0.102753964 container init 6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:25:18 np0005603500 podman[182918]: 2026-01-31 06:25:18.966484771 +0000 UTC m=+0.110099776 container start 6fea441eb65c817752261bb714d055cd3026d27eafdf5ce4809afba5fd7fa164 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Jan 31 01:25:18 np0005603500 podman[182918]: nova_compute
Jan 31 01:25:18 np0005603500 nova_compute[182934]: + sudo -E kolla_set_configs
Jan 31 01:25:18 np0005603500 systemd[1]: Started nova_compute container.
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Validating config file
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Copying service configuration files
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Deleting /etc/ceph
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Creating directory /etc/ceph
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /etc/ceph
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Writing out command to execute
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 01:25:19 np0005603500 nova_compute[182934]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 01:25:19 np0005603500 nova_compute[182934]: ++ cat /run_command
Jan 31 01:25:19 np0005603500 nova_compute[182934]: + CMD=nova-compute
Jan 31 01:25:19 np0005603500 nova_compute[182934]: + ARGS=
Jan 31 01:25:19 np0005603500 nova_compute[182934]: + sudo kolla_copy_cacerts
Jan 31 01:25:19 np0005603500 nova_compute[182934]: + [[ ! -n '' ]]
Jan 31 01:25:19 np0005603500 nova_compute[182934]: + . kolla_extend_start
Jan 31 01:25:19 np0005603500 nova_compute[182934]: + echo 'Running command: '\''nova-compute'\'''
Jan 31 01:25:19 np0005603500 nova_compute[182934]: Running command: 'nova-compute'
Jan 31 01:25:19 np0005603500 nova_compute[182934]: + umask 0022
Jan 31 01:25:19 np0005603500 nova_compute[182934]: + exec nova-compute
Jan 31 01:25:19 np0005603500 python3.9[183097]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 31 01:25:19 np0005603500 systemd[1]: Started libpod-conmon-becbe0ceaca7c4faf9c895830b012cfdc855f21860396cd715b4db758c95fbad.scope.
Jan 31 01:25:19 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:25:19 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57c4d14bc54502fb0024a428384a25199d09ec56974b33f1011a36da2b8787b1/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:19 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57c4d14bc54502fb0024a428384a25199d09ec56974b33f1011a36da2b8787b1/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:19 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57c4d14bc54502fb0024a428384a25199d09ec56974b33f1011a36da2b8787b1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:19 np0005603500 podman[183122]: 2026-01-31 06:25:19.904196277 +0000 UTC m=+0.175289754 container init becbe0ceaca7c4faf9c895830b012cfdc855f21860396cd715b4db758c95fbad (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute_init, config_id=edpm, container_name=nova_compute_init, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 01:25:19 np0005603500 podman[183122]: 2026-01-31 06:25:19.912668134 +0000 UTC m=+0.183761651 container start becbe0ceaca7c4faf9c895830b012cfdc855f21860396cd715b4db758c95fbad (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3)
Jan 31 01:25:19 np0005603500 python3.9[183097]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Applying nova statedir ownership
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 31 01:25:19 np0005603500 nova_compute_init[183144]: INFO:nova_statedir:Nova statedir ownership complete
Jan 31 01:25:19 np0005603500 systemd[1]: libpod-becbe0ceaca7c4faf9c895830b012cfdc855f21860396cd715b4db758c95fbad.scope: Deactivated successfully.
Jan 31 01:25:19 np0005603500 podman[183145]: 2026-01-31 06:25:19.987485885 +0000 UTC m=+0.038724663 container died becbe0ceaca7c4faf9c895830b012cfdc855f21860396cd715b4db758c95fbad (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute_init, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 01:25:20 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-becbe0ceaca7c4faf9c895830b012cfdc855f21860396cd715b4db758c95fbad-userdata-shm.mount: Deactivated successfully.
Jan 31 01:25:20 np0005603500 systemd[1]: var-lib-containers-storage-overlay-57c4d14bc54502fb0024a428384a25199d09ec56974b33f1011a36da2b8787b1-merged.mount: Deactivated successfully.
Jan 31 01:25:20 np0005603500 podman[183152]: 2026-01-31 06:25:20.113172932 +0000 UTC m=+0.138190552 container cleanup becbe0ceaca7c4faf9c895830b012cfdc855f21860396cd715b4db758c95fbad (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 01:25:20 np0005603500 systemd[1]: libpod-conmon-becbe0ceaca7c4faf9c895830b012cfdc855f21860396cd715b4db758c95fbad.scope: Deactivated successfully.
Jan 31 01:25:20 np0005603500 systemd[1]: session-23.scope: Deactivated successfully.
Jan 31 01:25:20 np0005603500 systemd[1]: session-23.scope: Consumed 1min 24.631s CPU time.
Jan 31 01:25:20 np0005603500 systemd-logind[821]: Session 23 logged out. Waiting for processes to exit.
Jan 31 01:25:20 np0005603500 systemd-logind[821]: Removed session 23.
Jan 31 01:25:21 np0005603500 nova_compute[182934]: 2026-01-31 06:25:21.059 182938 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 31 01:25:21 np0005603500 nova_compute[182934]: 2026-01-31 06:25:21.059 182938 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 31 01:25:21 np0005603500 nova_compute[182934]: 2026-01-31 06:25:21.060 182938 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 31 01:25:21 np0005603500 nova_compute[182934]: 2026-01-31 06:25:21.060 182938 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 31 01:25:21 np0005603500 nova_compute[182934]: 2026-01-31 06:25:21.100 182938 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:25:21 np0005603500 nova_compute[182934]: 2026-01-31 06:25:21.113 182938 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:25:21 np0005603500 nova_compute[182934]: 2026-01-31 06:25:21.114 182938 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:423
Jan 31 01:25:21 np0005603500 nova_compute[182934]: 2026-01-31 06:25:21.146 182938 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Jan 31 01:25:21 np0005603500 nova_compute[182934]: 2026-01-31 06:25:21.149 182938 WARNING oslo_config.cfg [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Jan 31 01:25:22 np0005603500 nova_compute[182934]: 2026-01-31 06:25:22.632 182938 INFO nova.virt.driver [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 31 01:25:22 np0005603500 nova_compute[182934]: 2026-01-31 06:25:22.772 182938 INFO nova.compute.provider_config [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.354 182938 DEBUG oslo_concurrency.lockutils [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.355 182938 DEBUG oslo_concurrency.lockutils [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.356 182938 DEBUG oslo_concurrency.lockutils [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.357 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/service.py:357
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.357 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.358 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.358 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.358 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.359 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.359 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.359 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.360 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.360 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.360 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.361 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.361 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.361 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.362 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.362 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.363 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.363 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.363 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.364 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.364 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.365 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.365 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.365 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.366 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.366 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.367 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.367 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.367 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.368 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.368 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.369 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.369 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.369 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.370 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.370 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.370 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.371 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.371 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.372 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.372 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.373 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.373 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.373 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.374 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.374 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.374 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.375 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.375 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.375 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.376 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.376 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.376 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.377 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.377 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.377 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.378 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.378 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.379 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.379 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.379 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.380 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.380 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.381 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.381 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.381 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.382 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.382 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.382 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.383 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.383 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.384 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.384 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.384 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.385 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.385 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.386 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.386 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.386 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.387 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.387 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.388 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.388 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.389 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.389 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.389 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.390 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.390 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.391 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.391 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.391 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.392 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.392 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.392 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.393 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.393 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.393 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.394 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.394 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.394 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.395 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.395 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.395 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.396 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.396 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.396 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.397 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.397 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.397 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.398 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.398 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.398 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.399 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.399 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.400 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.400 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.401 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.401 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.401 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.402 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.402 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.403 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.403 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.403 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.404 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.404 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.404 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.405 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.405 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.406 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.406 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.406 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.407 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.407 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.407 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.407 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.408 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.408 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.408 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.409 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.409 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.409 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.409 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.410 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.410 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.410 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.411 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.411 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.411 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.411 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.412 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.412 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.412 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.413 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.413 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.413 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.413 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.414 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.414 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.414 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.414 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.415 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.415 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.415 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.416 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.416 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.416 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.416 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.417 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.417 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.417 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.418 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.418 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.418 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.418 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.419 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.419 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.419 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.420 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.420 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.420 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.420 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.421 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.421 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.421 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.421 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.422 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.422 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.422 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.422 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.423 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.423 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.423 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.423 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.424 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.424 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.424 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.424 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.425 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.425 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.425 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.425 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.426 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.426 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.426 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.426 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.427 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.427 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.427 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.427 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.428 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.428 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.428 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.429 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.429 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.429 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.429 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.430 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.430 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.430 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.430 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.431 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.431 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.431 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.431 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.431 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.432 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.432 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.432 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.433 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.433 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.433 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.433 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.434 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.434 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.434 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.434 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.435 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.435 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.435 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.435 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.436 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.436 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.436 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.436 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.437 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.437 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.437 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.437 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.438 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.438 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.438 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.438 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.439 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.439 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.439 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.439 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.440 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.440 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.440 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.440 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.441 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.441 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.441 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.442 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.442 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.442 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.442 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.442 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.443 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.443 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.443 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.443 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.443 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.443 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.443 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.444 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.444 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.444 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.444 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.444 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.444 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.445 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.445 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.445 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.445 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.445 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.445 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.446 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.446 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.446 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.446 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.446 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.446 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.446 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.447 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.447 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.447 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.447 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.447 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.447 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.448 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.448 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.448 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.448 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.448 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.448 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.448 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.449 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.449 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.449 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.449 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.449 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.449 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.449 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.450 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.450 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.450 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.450 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.450 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.451 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.451 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.451 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.451 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.451 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.451 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.452 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.452 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.452 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.452 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.452 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.452 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.452 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.453 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.453 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.453 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.453 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.453 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.454 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.454 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.454 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.454 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.454 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.455 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.455 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.455 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.455 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.455 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.455 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.455 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.456 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.456 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.456 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.456 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.456 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.456 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.456 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.457 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.457 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.457 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.457 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.457 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.458 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.458 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.458 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.458 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.458 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.458 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.458 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.459 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.459 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.459 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.459 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.459 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.459 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.459 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.460 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.460 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.460 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.460 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.460 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.460 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.460 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.461 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.461 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.461 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.461 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.462 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.462 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.462 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.462 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.462 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.463 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.463 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.463 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.463 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.463 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.463 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.464 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.464 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.464 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.464 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.464 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.465 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.465 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.465 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.465 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.465 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.465 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.465 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.466 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.466 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.466 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.466 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.466 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.466 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.466 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.467 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.467 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.467 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.467 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.467 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.467 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.467 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.468 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.468 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.468 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.468 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.468 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.468 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.468 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.469 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.469 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.469 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.469 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.469 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.469 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.469 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.470 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.470 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.470 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.470 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.470 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.470 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.470 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.471 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.471 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.471 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.471 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.471 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.471 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.471 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.472 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.472 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.472 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.472 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.472 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.472 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.472 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.473 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.473 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.473 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.473 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.473 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.473 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.473 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.474 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.474 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.474 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.474 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.474 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.474 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.474 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.475 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.475 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.475 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.475 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.475 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.476 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.476 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.476 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.476 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.476 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.476 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.476 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.477 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.477 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.477 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.477 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.477 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.477 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.478 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.478 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.478 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.478 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.478 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.478 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.478 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.479 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.479 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.479 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.479 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.479 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.479 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.479 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.480 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.480 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.480 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.480 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.480 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.480 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.480 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.481 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.481 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.481 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.481 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.481 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.481 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.481 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.482 182938 WARNING oslo_config.cfg [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 31 01:25:23 np0005603500 nova_compute[182934]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 31 01:25:23 np0005603500 nova_compute[182934]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 31 01:25:23 np0005603500 nova_compute[182934]: and ``live_migration_inbound_addr`` respectively.
Jan 31 01:25:23 np0005603500 nova_compute[182934]: ).  Its value may be silently ignored in the future.
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.482 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.482 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.482 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.482 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.483 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.483 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.483 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.483 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.483 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.483 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.483 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.484 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.484 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.484 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.484 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.484 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.484 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.485 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.485 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.485 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.485 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.485 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.485 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.485 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.486 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.486 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.486 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.486 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.486 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.486 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.486 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.487 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.487 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.487 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.487 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.487 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.487 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.487 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.488 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.488 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.488 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.488 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.488 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.489 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.489 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.489 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.489 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.489 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.489 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.489 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.490 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.490 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.490 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.490 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.490 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.490 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.490 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.491 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.491 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.491 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.491 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.491 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.491 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.491 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.492 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.492 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.492 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.492 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.492 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.492 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.493 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.493 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.493 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.493 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.493 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.493 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.493 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.494 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.494 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.494 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.494 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.494 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.494 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.494 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.495 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.495 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.495 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.495 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.495 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.495 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.495 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.496 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.496 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.496 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.496 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.496 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.497 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.497 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.497 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.497 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.497 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.497 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.497 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.498 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.498 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.498 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.498 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.498 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.498 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.498 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.499 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.499 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.499 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.499 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.499 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.499 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.499 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.500 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.500 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.500 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.500 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.500 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.500 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.500 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.501 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.501 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.501 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.501 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.501 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.501 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.501 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.502 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.502 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.502 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.502 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.502 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.502 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.502 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.503 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.503 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.503 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.503 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.503 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.503 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.503 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.504 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.504 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.504 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.504 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.504 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.504 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.504 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.505 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.505 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.505 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.505 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.505 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.505 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.506 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.506 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.506 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.506 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.506 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.506 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.506 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.507 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.507 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.507 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.507 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.507 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.507 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.508 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.508 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.508 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.508 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.508 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.508 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.508 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.509 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.509 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.509 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.509 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.509 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.509 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.509 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.510 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.510 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.510 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.510 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.510 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.510 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.511 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.511 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.511 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.511 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.511 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.511 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.512 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.512 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.512 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.512 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.512 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.512 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.512 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.513 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.513 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.513 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.513 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.513 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.513 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.514 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.514 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.514 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.514 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.514 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.514 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.514 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.515 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.515 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.515 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.515 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.515 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.515 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.515 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.516 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.516 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.516 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.516 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.516 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.516 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.516 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.517 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.517 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.517 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.517 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.517 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.517 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.517 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.518 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.518 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.518 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.518 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.518 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.518 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.518 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.519 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.519 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.519 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.519 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.519 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.519 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.519 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.520 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.520 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.520 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.520 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.520 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.520 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.521 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.521 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.521 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.521 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.521 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.522 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.522 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.522 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.522 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.522 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.522 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.522 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.523 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.523 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.523 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.523 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.523 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.523 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.523 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.523 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.524 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.524 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.524 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.524 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.524 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.524 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.525 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.525 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.525 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.525 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.525 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.525 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.525 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.526 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.526 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.526 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.526 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.526 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.526 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.527 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.527 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.527 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.527 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.527 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.527 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.527 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.528 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.528 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.528 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.528 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.528 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.529 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.529 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.529 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.529 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.529 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.530 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.530 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.530 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.530 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.530 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.530 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.531 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.531 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.531 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.531 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.531 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.531 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.532 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.532 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.532 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.532 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.533 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.533 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.533 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.533 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.533 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.533 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.533 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.534 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.534 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.534 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.534 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.534 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.534 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.534 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.535 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.535 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.535 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.535 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.535 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.535 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.535 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.536 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.536 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.536 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.536 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.536 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.536 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.537 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.537 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.537 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.537 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.537 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.537 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.537 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.538 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.538 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.538 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.538 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.538 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.538 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.539 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.539 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.539 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.539 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.539 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.539 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.539 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.540 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.540 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.540 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.540 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.540 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.540 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.540 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.541 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.541 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.541 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.541 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.541 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.541 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.541 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.542 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.542 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.542 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.542 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.542 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.542 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.542 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.543 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.543 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.543 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.543 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.543 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.543 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.544 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.544 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.544 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.544 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.544 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.544 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.544 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.545 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.545 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.545 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.545 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.545 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.545 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.545 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.546 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.546 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.546 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.546 182938 DEBUG oslo_service.backend.eventlet.service [None req-ce11ed40-184c-4ef7-ad2f-c8a4e5dee080 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Jan 31 01:25:23 np0005603500 nova_compute[182934]: 2026-01-31 06:25:23.547 182938 INFO nova.service [-] Starting compute node (version 31.1.0-0.20250428102727.3e7017e.el9)
Jan 31 01:25:24 np0005603500 nova_compute[182934]: 2026-01-31 06:25:24.382 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:495
Jan 31 01:25:24 np0005603500 systemd[1]: Starting libvirt QEMU daemon...
Jan 31 01:25:24 np0005603500 systemd[1]: Started libvirt QEMU daemon.
Jan 31 01:25:24 np0005603500 nova_compute[182934]: 2026-01-31 06:25:24.465 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f62eb716400> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:501
Jan 31 01:25:24 np0005603500 nova_compute[182934]: libvirt:  error : internal error: could not initialize domain event timer
Jan 31 01:25:24 np0005603500 nova_compute[182934]: 2026-01-31 06:25:24.467 182938 WARNING nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Jan 31 01:25:24 np0005603500 nova_compute[182934]: 2026-01-31 06:25:24.467 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f62eb716400> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:522
Jan 31 01:25:24 np0005603500 nova_compute[182934]: 2026-01-31 06:25:24.470 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:481
Jan 31 01:25:24 np0005603500 nova_compute[182934]: 2026-01-31 06:25:24.470 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:487
Jan 31 01:25:24 np0005603500 nova_compute[182934]: 2026-01-31 06:25:24.471 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:490
Jan 31 01:25:24 np0005603500 nova_compute[182934]: 2026-01-31 06:25:24.471 182938 INFO nova.virt.libvirt.driver [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Connection event '1' reason 'None'
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.250 182938 INFO nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Libvirt host capabilities <capabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <host>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <uuid>e984390d-4171-477a-ad02-535ae0fc7a74</uuid>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <arch>x86_64</arch>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model>EPYC-Rome-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <vendor>AMD</vendor>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <microcode version='16777317'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <signature family='23' model='49' stepping='0'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='x2apic'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='tsc-deadline'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='osxsave'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='hypervisor'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='tsc_adjust'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='spec-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='stibp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='arch-capabilities'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='cmp_legacy'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='topoext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='virt-ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='lbrv'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='tsc-scale'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='vmcb-clean'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='pause-filter'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='pfthreshold'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='svme-addr-chk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='rdctl-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='skip-l1dfl-vmentry'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='mds-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature name='pschange-mc-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <pages unit='KiB' size='4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <pages unit='KiB' size='2048'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <pages unit='KiB' size='1048576'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <power_management>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <suspend_mem/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <suspend_disk/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <suspend_hybrid/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </power_management>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <iommu support='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <migration_features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <live/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <uri_transports>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <uri_transport>tcp</uri_transport>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <uri_transport>rdma</uri_transport>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </uri_transports>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </migration_features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <topology>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <cells num='1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <cell id='0'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:          <memory unit='KiB'>7864300</memory>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:          <pages unit='KiB' size='4'>1966075</pages>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:          <pages unit='KiB' size='2048'>0</pages>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:          <distances>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:            <sibling id='0' value='10'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:          </distances>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:          <cpus num='8'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:          </cpus>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        </cell>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </cells>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </topology>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <cache>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </cache>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <secmodel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model>selinux</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <doi>0</doi>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </secmodel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <secmodel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model>dac</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <doi>0</doi>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </secmodel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </host>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <guest>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <os_type>hvm</os_type>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <arch name='i686'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <wordsize>32</wordsize>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <domain type='qemu'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <domain type='kvm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </arch>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <pae/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <nonpae/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <acpi default='on' toggle='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <apic default='on' toggle='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <cpuselection/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <deviceboot/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <disksnapshot default='on' toggle='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <externalSnapshot/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </guest>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <guest>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <os_type>hvm</os_type>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <arch name='x86_64'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <wordsize>64</wordsize>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <domain type='qemu'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <domain type='kvm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </arch>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <acpi default='on' toggle='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <apic default='on' toggle='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <cpuselection/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <deviceboot/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <disksnapshot default='on' toggle='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <externalSnapshot/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </guest>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 
Jan 31 01:25:25 np0005603500 nova_compute[182934]: </capabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.256 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:941
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.273 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 31 01:25:25 np0005603500 nova_compute[182934]: <domainCapabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <domain>kvm</domain>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <arch>i686</arch>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <vcpu max='240'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <iothreads supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <os supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <enum name='firmware'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <loader supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>rom</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pflash</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='readonly'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>yes</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>no</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='secure'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>no</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </loader>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='host-passthrough' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='hostPassthroughMigratable'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>on</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>off</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='maximum' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='maximumMigratable'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>on</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>off</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='host-model' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <vendor>AMD</vendor>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='x2apic'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='hypervisor'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='stibp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='overflow-recov'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='succor'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='lbrv'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc-scale'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='flushbyasid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='pause-filter'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='pfthreshold'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='disable' name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='custom' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='ClearwaterForest'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ddpd-u'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sha512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='ClearwaterForest-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ddpd-u'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sha512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Dhyana-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Turin'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vp2intersect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibpb-brtype'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbpb'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='srso-user-kernel-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Turin-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vp2intersect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibpb-brtype'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbpb'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='srso-user-kernel-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-128'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-256'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-128'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-256'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v6'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v7'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='KnightsMill'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4fmaps'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4vnniw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512er'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512pf'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='KnightsMill-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4fmaps'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4vnniw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512er'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512pf'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G4-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tbm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G5-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tbm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='athlon'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='athlon-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='core2duo'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='core2duo-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='coreduo'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='coreduo-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='n270'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='n270-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='phenom'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='phenom-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <memoryBacking supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <enum name='sourceType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>file</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>anonymous</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>memfd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </memoryBacking>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <disk supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='diskDevice'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>disk</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>cdrom</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>floppy</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>lun</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='bus'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ide</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>fdc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>scsi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>sata</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-non-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <graphics supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vnc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>egl-headless</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dbus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <video supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='modelType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vga</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>cirrus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>none</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>bochs</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ramfb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <hostdev supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='mode'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>subsystem</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='startupPolicy'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>default</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>mandatory</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>requisite</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>optional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='subsysType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pci</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>scsi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='capsType'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='pciBackend'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </hostdev>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <rng supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-non-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>random</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>egd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>builtin</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <filesystem supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='driverType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>path</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>handle</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtiofs</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </filesystem>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <tpm supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tpm-tis</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tpm-crb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>emulator</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>external</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendVersion'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>2.0</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </tpm>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <redirdev supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='bus'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </redirdev>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <channel supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pty</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>unix</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </channel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <crypto supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>qemu</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>builtin</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </crypto>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <interface supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>default</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>passt</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <panic supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>isa</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>hyperv</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </panic>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <console supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>null</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pty</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dev</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>file</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pipe</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>stdio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>udp</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tcp</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>unix</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>qemu-vdagent</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dbus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <gic supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <vmcoreinfo supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <genid supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <backingStoreInput supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <backup supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <async-teardown supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <s390-pv supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <ps2 supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <tdx supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <sev supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <sgx supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <hyperv supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='features'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>relaxed</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vapic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>spinlocks</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vpindex</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>runtime</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>synic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>stimer</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>reset</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vendor_id</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>frequencies</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>reenlightenment</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tlbflush</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ipi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>avic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>emsr_bitmap</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>xmm_input</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <defaults>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <spinlocks>4095</spinlocks>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <stimer_direct>on</stimer_direct>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </defaults>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </hyperv>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <launchSecurity supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: </domainCapabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1026
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.282 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 31 01:25:25 np0005603500 nova_compute[182934]: <domainCapabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <domain>kvm</domain>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <arch>i686</arch>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <vcpu max='4096'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <iothreads supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <os supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <enum name='firmware'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <loader supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>rom</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pflash</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='readonly'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>yes</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>no</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='secure'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>no</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </loader>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='host-passthrough' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='hostPassthroughMigratable'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>on</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>off</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='maximum' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='maximumMigratable'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>on</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>off</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='host-model' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <vendor>AMD</vendor>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='x2apic'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='hypervisor'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='stibp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='overflow-recov'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='succor'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='lbrv'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc-scale'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='flushbyasid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='pause-filter'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='pfthreshold'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='disable' name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='custom' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='ClearwaterForest'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ddpd-u'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sha512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='ClearwaterForest-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ddpd-u'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sha512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Dhyana-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Turin'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vp2intersect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibpb-brtype'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbpb'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='srso-user-kernel-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Turin-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vp2intersect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibpb-brtype'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbpb'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='srso-user-kernel-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-128'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-256'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-128'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-256'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v6'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v7'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='KnightsMill'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4fmaps'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4vnniw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512er'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512pf'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='KnightsMill-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4fmaps'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4vnniw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512er'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512pf'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G4-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tbm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G5-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tbm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='athlon'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='athlon-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='core2duo'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='core2duo-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='coreduo'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='coreduo-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='n270'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='n270-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='phenom'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='phenom-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <memoryBacking supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <enum name='sourceType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>file</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>anonymous</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>memfd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </memoryBacking>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <disk supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='diskDevice'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>disk</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>cdrom</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>floppy</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>lun</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='bus'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>fdc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>scsi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>sata</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-non-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <graphics supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vnc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>egl-headless</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dbus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <video supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='modelType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vga</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>cirrus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>none</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>bochs</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ramfb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <hostdev supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='mode'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>subsystem</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='startupPolicy'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>default</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>mandatory</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>requisite</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>optional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='subsysType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pci</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>scsi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='capsType'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='pciBackend'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </hostdev>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <rng supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-non-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>random</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>egd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>builtin</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <filesystem supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='driverType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>path</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>handle</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtiofs</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </filesystem>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <tpm supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tpm-tis</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tpm-crb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>emulator</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>external</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendVersion'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>2.0</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </tpm>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <redirdev supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='bus'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </redirdev>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <channel supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pty</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>unix</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </channel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <crypto supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>qemu</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>builtin</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </crypto>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <interface supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>default</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>passt</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <panic supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>isa</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>hyperv</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </panic>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <console supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>null</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pty</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dev</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>file</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pipe</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>stdio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>udp</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tcp</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>unix</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>qemu-vdagent</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dbus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <gic supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <vmcoreinfo supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <genid supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <backingStoreInput supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <backup supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <async-teardown supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <s390-pv supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <ps2 supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <tdx supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <sev supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <sgx supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <hyperv supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='features'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>relaxed</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vapic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>spinlocks</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vpindex</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>runtime</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>synic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>stimer</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>reset</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vendor_id</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>frequencies</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>reenlightenment</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tlbflush</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ipi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>avic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>emsr_bitmap</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>xmm_input</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <defaults>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <spinlocks>4095</spinlocks>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <stimer_direct>on</stimer_direct>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </defaults>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </hyperv>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <launchSecurity supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: </domainCapabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1026
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.333 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:941
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.338 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 31 01:25:25 np0005603500 nova_compute[182934]: <domainCapabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <domain>kvm</domain>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <arch>x86_64</arch>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <vcpu max='4096'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <iothreads supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <os supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <enum name='firmware'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>efi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <loader supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>rom</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pflash</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='readonly'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>yes</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>no</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='secure'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>yes</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>no</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </loader>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='host-passthrough' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='hostPassthroughMigratable'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>on</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>off</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='maximum' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='maximumMigratable'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>on</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>off</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='host-model' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <vendor>AMD</vendor>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='x2apic'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='hypervisor'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='stibp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='overflow-recov'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='succor'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='lbrv'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc-scale'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='flushbyasid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='pause-filter'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='pfthreshold'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='disable' name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='custom' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='ClearwaterForest'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ddpd-u'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sha512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='ClearwaterForest-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ddpd-u'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sha512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Dhyana-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Turin'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vp2intersect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibpb-brtype'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbpb'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='srso-user-kernel-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Turin-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vp2intersect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibpb-brtype'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbpb'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='srso-user-kernel-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-128'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-256'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-128'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-256'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v6'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v7'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='KnightsMill'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4fmaps'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4vnniw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512er'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512pf'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='KnightsMill-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4fmaps'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4vnniw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512er'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512pf'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G4-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tbm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G5-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tbm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='athlon'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='athlon-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='core2duo'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='core2duo-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='coreduo'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='coreduo-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='n270'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='n270-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='phenom'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='phenom-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <memoryBacking supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <enum name='sourceType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>file</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>anonymous</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>memfd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </memoryBacking>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <disk supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='diskDevice'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>disk</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>cdrom</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>floppy</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>lun</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='bus'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>fdc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>scsi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>sata</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-non-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <graphics supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vnc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>egl-headless</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dbus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <video supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='modelType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vga</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>cirrus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>none</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>bochs</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ramfb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <hostdev supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='mode'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>subsystem</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='startupPolicy'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>default</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>mandatory</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>requisite</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>optional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='subsysType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pci</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>scsi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='capsType'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='pciBackend'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </hostdev>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <rng supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-non-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>random</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>egd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>builtin</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <filesystem supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='driverType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>path</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>handle</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtiofs</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </filesystem>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <tpm supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tpm-tis</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tpm-crb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>emulator</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>external</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendVersion'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>2.0</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </tpm>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <redirdev supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='bus'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </redirdev>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <channel supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pty</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>unix</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </channel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <crypto supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>qemu</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>builtin</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </crypto>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <interface supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>default</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>passt</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <panic supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>isa</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>hyperv</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </panic>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <console supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>null</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pty</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dev</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>file</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pipe</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>stdio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>udp</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tcp</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>unix</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>qemu-vdagent</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dbus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <gic supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <vmcoreinfo supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <genid supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <backingStoreInput supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <backup supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <async-teardown supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <s390-pv supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <ps2 supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <tdx supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <sev supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <sgx supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <hyperv supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='features'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>relaxed</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vapic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>spinlocks</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vpindex</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>runtime</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>synic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>stimer</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>reset</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vendor_id</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>frequencies</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>reenlightenment</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tlbflush</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ipi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>avic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>emsr_bitmap</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>xmm_input</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <defaults>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <spinlocks>4095</spinlocks>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <stimer_direct>on</stimer_direct>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </defaults>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </hyperv>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <launchSecurity supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: </domainCapabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1026
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.422 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 31 01:25:25 np0005603500 nova_compute[182934]: <domainCapabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <domain>kvm</domain>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <arch>x86_64</arch>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <vcpu max='240'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <iothreads supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <os supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <enum name='firmware'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <loader supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>rom</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pflash</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='readonly'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>yes</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>no</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='secure'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>no</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </loader>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='host-passthrough' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='hostPassthroughMigratable'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>on</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>off</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='maximum' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='maximumMigratable'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>on</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>off</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='host-model' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <vendor>AMD</vendor>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='x2apic'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='hypervisor'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='stibp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='overflow-recov'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='succor'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='lbrv'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='tsc-scale'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='flushbyasid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='pause-filter'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='pfthreshold'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <feature policy='disable' name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <mode name='custom' supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Broadwell-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='ClearwaterForest'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ddpd-u'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sha512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='ClearwaterForest-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ddpd-u'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sha512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm3'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sm4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Cooperlake-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Denverton-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Dhyana-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Milan-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Rome-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Turin'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vp2intersect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibpb-brtype'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbpb'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='srso-user-kernel-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-Turin-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amd-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='auto-ibrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vp2intersect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fs-gs-base-ns'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibpb-brtype'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='no-nested-data-bp'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='null-sel-clr-base'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='perfmon-v2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbpb'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='srso-user-kernel-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='stibp-always-on'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='EPYC-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-128'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-256'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='GraniteRapids-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-128'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-256'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx10-512'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='prefetchiti'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Haswell-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 systemd[1]: Starting libvirt nodedev daemon...
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v6'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Icelake-Server-v7'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='IvyBridge-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='KnightsMill'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4fmaps'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4vnniw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512er'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512pf'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='KnightsMill-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4fmaps'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-4vnniw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512er'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512pf'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G4-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tbm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Opteron_G5-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fma4'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tbm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xop'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SapphireRapids-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='amx-tile'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-bf16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-fp16'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512-vpopcntdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bitalg'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vbmi2'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrc'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fzrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='la57'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='taa-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='tsx-ldtrk'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 systemd[1]: Started libvirt nodedev daemon.
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='SierraForest-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ifma'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-ne-convert'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx-vnni-int8'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bhi-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='bus-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cmpccxadd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fbsdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='fsrs'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ibrs-all'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='intel-psfd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ipred-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='lam'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mcdt-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pbrsb-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='psdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rrsba-ctrl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='sbdr-ssdp-no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='serialize'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vaes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='vpclmulqdq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Client-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='hle'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='rtm'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Skylake-Server-v5'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512bw'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512cd'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512dq'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512f'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='avx512vl'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='invpcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pcid'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='pku'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='mpx'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v2'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v3'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='core-capability'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='split-lock-detect'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='Snowridge-v4'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='cldemote'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='erms'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='gfni'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdir64b'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='movdiri'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='xsaves'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='athlon'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='athlon-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='core2duo'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='core2duo-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='coreduo'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='coreduo-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='n270'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='n270-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='ss'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='phenom'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <blockers model='phenom-v1'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnow'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <feature name='3dnowext'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </blockers>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </mode>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <memoryBacking supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <enum name='sourceType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>file</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>anonymous</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <value>memfd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </memoryBacking>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <disk supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='diskDevice'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>disk</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>cdrom</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>floppy</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>lun</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='bus'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ide</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>fdc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>scsi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>sata</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-non-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <graphics supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vnc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>egl-headless</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dbus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <video supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='modelType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vga</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>cirrus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>none</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>bochs</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ramfb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <hostdev supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='mode'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>subsystem</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='startupPolicy'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>default</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>mandatory</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>requisite</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>optional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='subsysType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pci</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>scsi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='capsType'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='pciBackend'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </hostdev>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <rng supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtio-non-transitional</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>random</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>egd</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>builtin</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <filesystem supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='driverType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>path</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>handle</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>virtiofs</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </filesystem>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <tpm supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tpm-tis</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tpm-crb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>emulator</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>external</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendVersion'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>2.0</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </tpm>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <redirdev supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='bus'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>usb</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </redirdev>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <channel supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pty</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>unix</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </channel>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <crypto supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>qemu</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendModel'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>builtin</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </crypto>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <interface supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='backendType'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>default</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>passt</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <panic supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='model'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>isa</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>hyperv</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </panic>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <console supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='type'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>null</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vc</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pty</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dev</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>file</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>pipe</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>stdio</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>udp</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tcp</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>unix</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>qemu-vdagent</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>dbus</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <gic supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <vmcoreinfo supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <genid supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <backingStoreInput supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <backup supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <async-teardown supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <s390-pv supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <ps2 supported='yes'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <tdx supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <sev supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <sgx supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <hyperv supported='yes'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <enum name='features'>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>relaxed</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vapic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>spinlocks</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vpindex</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>runtime</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>synic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>stimer</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>reset</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>vendor_id</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>frequencies</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>reenlightenment</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>tlbflush</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>ipi</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>avic</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>emsr_bitmap</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <value>xmm_input</value>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </enum>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      <defaults>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <spinlocks>4095</spinlocks>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <stimer_direct>on</stimer_direct>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:      </defaults>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    </hyperv>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:    <launchSecurity supported='no'/>
Jan 31 01:25:25 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: </domainCapabilities>
Jan 31 01:25:25 np0005603500 nova_compute[182934]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1026
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.488 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1874
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.489 182938 INFO nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Secure Boot support detected
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.494 182938 INFO nova.virt.libvirt.driver [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.495 182938 INFO nova.virt.libvirt.driver [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.628 182938 DEBUG nova.virt.libvirt.driver [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1197
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.730 182938 WARNING nova.virt.libvirt.driver [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 31 01:25:25 np0005603500 nova_compute[182934]: 2026-01-31 06:25:25.731 182938 DEBUG nova.virt.libvirt.volume.mount [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 31 01:25:26 np0005603500 nova_compute[182934]: 2026-01-31 06:25:26.163 182938 INFO nova.virt.node [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Determined node identity b70e363b-8d1d-4e70-9fa4-9b0009536a59 from /var/lib/nova/compute_id
Jan 31 01:25:26 np0005603500 nova_compute[182934]: 2026-01-31 06:25:26.679 182938 WARNING nova.compute.manager [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Compute nodes ['b70e363b-8d1d-4e70-9fa4-9b0009536a59'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 31 01:25:27 np0005603500 systemd-logind[821]: New session 25 of user zuul.
Jan 31 01:25:27 np0005603500 systemd[1]: Started Session 25 of User zuul.
Jan 31 01:25:27 np0005603500 nova_compute[182934]: 2026-01-31 06:25:27.746 182938 INFO nova.compute.manager [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 31 01:25:28 np0005603500 python3.9[183456]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:25:28 np0005603500 nova_compute[182934]: 2026-01-31 06:25:28.778 182938 WARNING nova.compute.manager [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 31 01:25:28 np0005603500 nova_compute[182934]: 2026-01-31 06:25:28.778 182938 DEBUG oslo_concurrency.lockutils [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:25:28 np0005603500 nova_compute[182934]: 2026-01-31 06:25:28.778 182938 DEBUG oslo_concurrency.lockutils [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:25:28 np0005603500 nova_compute[182934]: 2026-01-31 06:25:28.779 182938 DEBUG oslo_concurrency.lockutils [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:25:28 np0005603500 nova_compute[182934]: 2026-01-31 06:25:28.779 182938 DEBUG nova.compute.resource_tracker [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:25:28 np0005603500 nova_compute[182934]: 2026-01-31 06:25:28.895 182938 WARNING nova.virt.libvirt.driver [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:25:28 np0005603500 nova_compute[182934]: 2026-01-31 06:25:28.896 182938 DEBUG nova.compute.resource_tracker [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6160MB free_disk=73.43303680419922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:25:28 np0005603500 nova_compute[182934]: 2026-01-31 06:25:28.896 182938 DEBUG oslo_concurrency.lockutils [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:25:28 np0005603500 nova_compute[182934]: 2026-01-31 06:25:28.896 182938 DEBUG oslo_concurrency.lockutils [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:25:29 np0005603500 podman[183584]: 2026-01-31 06:25:29.304362723 +0000 UTC m=+0.054228833 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:25:29 np0005603500 nova_compute[182934]: 2026-01-31 06:25:29.415 182938 WARNING nova.compute.resource_tracker [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] No compute node record for compute-0.ctlplane.example.com:b70e363b-8d1d-4e70-9fa4-9b0009536a59: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host b70e363b-8d1d-4e70-9fa4-9b0009536a59 could not be found.
Jan 31 01:25:29 np0005603500 python3.9[183631]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:25:29 np0005603500 systemd[1]: Reloading.
Jan 31 01:25:29 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:25:29 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:25:29 np0005603500 nova_compute[182934]: 2026-01-31 06:25:29.935 182938 INFO nova.compute.resource_tracker [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: b70e363b-8d1d-4e70-9fa4-9b0009536a59
Jan 31 01:25:30 np0005603500 python3.9[183816]: ansible-ansible.builtin.service_facts Invoked
Jan 31 01:25:31 np0005603500 network[183833]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 01:25:31 np0005603500 network[183834]: 'network-scripts' will be removed from distribution in near future.
Jan 31 01:25:31 np0005603500 network[183835]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 01:25:31 np0005603500 nova_compute[182934]: 2026-01-31 06:25:31.585 182938 DEBUG nova.compute.resource_tracker [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:25:31 np0005603500 nova_compute[182934]: 2026-01-31 06:25:31.585 182938 DEBUG nova.compute.resource_tracker [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:25:33 np0005603500 nova_compute[182934]: 2026-01-31 06:25:33.030 182938 INFO nova.scheduler.client.report [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] [req-9070890d-846e-4d19-9783-2b9245c0a04c] Created resource provider record via placement API for resource provider with UUID b70e363b-8d1d-4e70-9fa4-9b0009536a59 and name compute-0.ctlplane.example.com.
Jan 31 01:25:33 np0005603500 nova_compute[182934]: 2026-01-31 06:25:33.726 182938 DEBUG nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 31 01:25:33 np0005603500 nova_compute[182934]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1946
Jan 31 01:25:33 np0005603500 nova_compute[182934]: 2026-01-31 06:25:33.727 182938 INFO nova.virt.libvirt.host [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] kernel doesn't support AMD SEV
Jan 31 01:25:33 np0005603500 nova_compute[182934]: 2026-01-31 06:25:33.727 182938 DEBUG nova.compute.provider_tree [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:25:33 np0005603500 nova_compute[182934]: 2026-01-31 06:25:33.728 182938 DEBUG nova.virt.libvirt.driver [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:25:34 np0005603500 nova_compute[182934]: 2026-01-31 06:25:34.542 182938 DEBUG nova.scheduler.client.report [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Updated inventory for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:975
Jan 31 01:25:34 np0005603500 nova_compute[182934]: 2026-01-31 06:25:34.543 182938 DEBUG nova.compute.provider_tree [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Updating resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 31 01:25:34 np0005603500 nova_compute[182934]: 2026-01-31 06:25:34.543 182938 DEBUG nova.compute.provider_tree [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:25:34 np0005603500 nova_compute[182934]: 2026-01-31 06:25:34.755 182938 DEBUG nova.compute.provider_tree [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Updating resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 31 01:25:35 np0005603500 nova_compute[182934]: 2026-01-31 06:25:35.294 182938 DEBUG nova.compute.resource_tracker [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:25:35 np0005603500 nova_compute[182934]: 2026-01-31 06:25:35.295 182938 DEBUG oslo_concurrency.lockutils [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:25:35 np0005603500 nova_compute[182934]: 2026-01-31 06:25:35.295 182938 DEBUG nova.service [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:177
Jan 31 01:25:35 np0005603500 python3.9[184107]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:25:36 np0005603500 python3.9[184260]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:36 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:25:36 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:25:36 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:25:36 np0005603500 python3.9[184413]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:37 np0005603500 python3.9[184565]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:25:37 np0005603500 nova_compute[182934]: 2026-01-31 06:25:37.666 182938 DEBUG nova.service [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:194
Jan 31 01:25:37 np0005603500 nova_compute[182934]: 2026-01-31 06:25:37.667 182938 DEBUG nova.servicegroup.drivers.db [None req-da67f9d4-518a-4d99-b0e2-cfde667f9dc2 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 31 01:25:38 np0005603500 python3.9[184717]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 01:25:38 np0005603500 python3.9[184869]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:25:38 np0005603500 systemd[1]: Reloading.
Jan 31 01:25:38 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:25:38 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:25:39 np0005603500 python3.9[185056]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:25:40 np0005603500 podman[185157]: 2026-01-31 06:25:40.189433413 +0000 UTC m=+0.098278803 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 31 01:25:40 np0005603500 python3.9[185234]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:25:41 np0005603500 python3.9[185384]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:41 np0005603500 python3.9[185538]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 31 01:25:42 np0005603500 python3.9[185690]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 31 01:25:43 np0005603500 python3.9[185843]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 01:25:44 np0005603500 python3.9[186001]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 01:25:45 np0005603500 python3.9[186159]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:46 np0005603500 python3.9[186280]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769840745.1536086-181-271352723532620/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:46 np0005603500 python3.9[186430]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:47 np0005603500 python3.9[186551]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769840746.2924511-181-128420747675325/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:47 np0005603500 python3.9[186701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:48 np0005603500 python3.9[186822]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769840747.303526-181-239604350155491/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:48 np0005603500 python3.9[186972]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:49 np0005603500 python3.9[187124]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:50 np0005603500 python3.9[187276]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:50 np0005603500 python3.9[187397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840749.6906347-240-113988819734699/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:25:51 np0005603500 python3.9[187547]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:51 np0005603500 python3.9[187668]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840751.0387251-240-183842184769432/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:25:52 np0005603500 python3.9[187818]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:53 np0005603500 python3.9[187939]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840752.166216-269-45116828449723/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:25:53 np0005603500 python3.9[188089]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:54 np0005603500 python3.9[188210]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840753.3006704-285-80766219341154/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:54 np0005603500 python3.9[188360]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:25:55.214 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:25:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:25:55.215 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:25:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:25:55.215 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:25:55 np0005603500 python3.9[188481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840754.4681063-300-20407340109275/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:56 np0005603500 python3.9[188632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:25:56 np0005603500 python3.9[188753]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840755.6125479-315-192958490141274/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:57 np0005603500 python3.9[188905]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:58 np0005603500 python3.9[189057]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:25:58 np0005603500 python3.9[189207]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:59 np0005603500 python3.9[189359]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:25:59 np0005603500 podman[189360]: 2026-01-31 06:25:59.470506758 +0000 UTC m=+0.055304118 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:25:59 np0005603500 python3.9[189530]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:26:00 np0005603500 python3.9[189684]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:01 np0005603500 python3.9[189836]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:26:01 np0005603500 systemd[1]: Reloading.
Jan 31 01:26:01 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:26:01 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:26:01 np0005603500 systemd[1]: Listening on Podman API Socket.
Jan 31 01:26:02 np0005603500 python3.9[190027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:02 np0005603500 python3.9[190150]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840761.8039005-387-168256132038868/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:03 np0005603500 python3.9[190226]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:04 np0005603500 python3.9[190349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840761.8039005-387-168256132038868/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:05 np0005603500 python3.9[190501]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:05 np0005603500 python3.9[190653]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:06 np0005603500 python3.9[190805]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:06 np0005603500 python3.9[190928]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840765.8938606-435-124618303853469/.source.json _original_basename=.xnbrus79 follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:07 np0005603500 python3.9[191078]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:09 np0005603500 python3.9[191501]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 31 01:26:10 np0005603500 python3.9[191653]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 01:26:11 np0005603500 podman[191777]: 2026-01-31 06:26:11.167599977 +0000 UTC m=+0.098979934 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:26:11 np0005603500 python3[191818]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 01:26:11 np0005603500 podman[191867]: 2026-01-31 06:26:11.540034735 +0000 UTC m=+0.020058372 image pull ccafe36535c9326f773051911bf7e736f46f05eea29aaa728ad791f05a9c5d70 quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:26:12 np0005603500 podman[191867]: 2026-01-31 06:26:12.025060369 +0000 UTC m=+0.505083916 container create 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, managed_by=edpm_ansible)
Jan 31 01:26:12 np0005603500 python3[191818]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0 kolla_start
Jan 31 01:26:12 np0005603500 python3.9[192057]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:26:13 np0005603500 python3.9[192211]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:13 np0005603500 python3.9[192287]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:26:14 np0005603500 python3.9[192438]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769840773.8219113-513-25996593803771/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:15 np0005603500 python3.9[192514]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:26:15 np0005603500 systemd[1]: Reloading.
Jan 31 01:26:15 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:26:15 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:26:16 np0005603500 python3.9[192625]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:26:16 np0005603500 systemd[1]: Reloading.
Jan 31 01:26:16 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:26:16 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:26:16 np0005603500 systemd[1]: Starting ceilometer_agent_compute container...
Jan 31 01:26:16 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:26:16 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0c4fa0311bc3ad22ca3c0a408345b670ae5898844ebcf06da688084d57b4691/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 31 01:26:16 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0c4fa0311bc3ad22ca3c0a408345b670ae5898844ebcf06da688084d57b4691/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 31 01:26:16 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0c4fa0311bc3ad22ca3c0a408345b670ae5898844ebcf06da688084d57b4691/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 31 01:26:16 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0c4fa0311bc3ad22ca3c0a408345b670ae5898844ebcf06da688084d57b4691/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 31 01:26:16 np0005603500 systemd[1]: Started /usr/bin/podman healthcheck run 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d.
Jan 31 01:26:16 np0005603500 podman[192665]: 2026-01-31 06:26:16.876061566 +0000 UTC m=+0.216741686 container init 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute)
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: + sudo -E kolla_set_configs
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: sudo: unable to send audit message: Operation not permitted
Jan 31 01:26:16 np0005603500 podman[192665]: 2026-01-31 06:26:16.918004506 +0000 UTC m=+0.258684606 container start 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Validating config file
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Copying service configuration files
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: INFO:__main__:Writing out command to execute
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: ++ cat /run_command
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: + ARGS=
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: + sudo kolla_copy_cacerts
Jan 31 01:26:16 np0005603500 podman[192665]: ceilometer_agent_compute
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: sudo: unable to send audit message: Operation not permitted
Jan 31 01:26:16 np0005603500 systemd[1]: Started ceilometer_agent_compute container.
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: + [[ ! -n '' ]]
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: + . kolla_extend_start
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: + umask 0022
Jan 31 01:26:16 np0005603500 ceilometer_agent_compute[192681]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 31 01:26:17 np0005603500 podman[192688]: 2026-01-31 06:26:17.022233685 +0000 UTC m=+0.096857485 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 31 01:26:17 np0005603500 systemd[1]: 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d-47188e907a613813.service: Main process exited, code=exited, status=1/FAILURE
Jan 31 01:26:17 np0005603500 systemd[1]: 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d-47188e907a613813.service: Failed with result 'exit-code'.
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.747 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.747 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.747 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.748 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.748 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.748 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.748 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.748 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.748 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.748 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.748 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.748 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.749 2 WARNING oslo_config.cfg [-] Deprecated: Option "tenant_name_discovery" from group "DEFAULT" is deprecated. Use option "identity_name_discovery" from group "DEFAULT".
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.750 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.751 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.752 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.753 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.754 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.755 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.756 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.757 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.759 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Jan 31 01:26:17 np0005603500 python3.9[192862]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.778 14 INFO ceilometer.polling.manager [-] Starting heartbeat child service. Listening on /var/lib/ceilometer/ceilometer-compute.socket
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.778 14 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.778 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.779 14 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.779 14 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.779 14 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.779 14 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.779 14 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.779 14 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.779 14 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.779 14 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.779 14 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.780 14 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.781 14 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.782 14 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.783 14 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.784 14 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.785 14 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.786 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 14 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.787 16 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.788 16 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 16 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.789 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.790 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.791 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.791 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.791 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.791 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.791 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.791 14 DEBUG cotyledon._service [-] Run service AgentHeartBeatManager(0) [14] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.792 14 DEBUG ceilometer.polling.manager [-] Started heartbeat child process. run /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:434
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.793 14 DEBUG ceilometer.polling.manager [-] Started heartbeat update thread _read_queue /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:437
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.794 14 DEBUG ceilometer.polling.manager [-] Started heartbeat reporting thread _report_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:442
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.821 16 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:95
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.948 16 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.949 16 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.949 16 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.949 16 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.950 16 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.950 16 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.950 16 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.950 16 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.950 16 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.950 16 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.951 16 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.951 16 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.951 16 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.951 16 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.951 16 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.951 16 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.951 16 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.951 16 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.952 16 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.952 16 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.952 16 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.952 16 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.952 16 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.952 16 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.952 16 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.952 16 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.953 16 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.954 16 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.955 16 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.956 16 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.957 16 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.958 16 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.959 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.960 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.961 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.962 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.963 16 DEBUG cotyledon._service [-] Run service AgentManager(0) [16] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.967 16 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.979 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.979 16 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:95
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.982 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:26:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:26:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:26:18 np0005603500 python3.9[193026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:19 np0005603500 python3.9[193151]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840778.2214994-558-266224314342799/.source.yaml _original_basename=.f6jed7_k follow=False checksum=3843597fa0700400e76cc1b0aecf8bc3021a3fa2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:19 np0005603500 python3.9[193303]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:20 np0005603500 python3.9[193426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840779.372082-573-94659472155129/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:21 np0005603500 python3.9[193578]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:21 np0005603500 python3.9[193730]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:22 np0005603500 python3.9[193882]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:22 np0005603500 python3.9[193960]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.8x1bs3if recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:23 np0005603500 python3.9[194110]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:24 np0005603500 nova_compute[182934]: 2026-01-31 06:26:24.668 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:24 np0005603500 nova_compute[182934]: 2026-01-31 06:26:24.670 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:24 np0005603500 nova_compute[182934]: 2026-01-31 06:26:24.670 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:24 np0005603500 nova_compute[182934]: 2026-01-31 06:26:24.670 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:24 np0005603500 nova_compute[182934]: 2026-01-31 06:26:24.671 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:24 np0005603500 nova_compute[182934]: 2026-01-31 06:26:24.671 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:24 np0005603500 nova_compute[182934]: 2026-01-31 06:26:24.671 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:25 np0005603500 nova_compute[182934]: 2026-01-31 06:26:25.189 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:25 np0005603500 nova_compute[182934]: 2026-01-31 06:26:25.190 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:26:25 np0005603500 nova_compute[182934]: 2026-01-31 06:26:25.190 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:25 np0005603500 python3.9[194533]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 31 01:26:25 np0005603500 nova_compute[182934]: 2026-01-31 06:26:25.938 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:26:25 np0005603500 nova_compute[182934]: 2026-01-31 06:26:25.938 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:26:25 np0005603500 nova_compute[182934]: 2026-01-31 06:26:25.938 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:26:25 np0005603500 nova_compute[182934]: 2026-01-31 06:26:25.938 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:26:26 np0005603500 nova_compute[182934]: 2026-01-31 06:26:26.090 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:26:26 np0005603500 nova_compute[182934]: 2026-01-31 06:26:26.091 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6025MB free_disk=73.43208694458008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:26:26 np0005603500 nova_compute[182934]: 2026-01-31 06:26:26.091 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:26:26 np0005603500 nova_compute[182934]: 2026-01-31 06:26:26.092 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:26:26 np0005603500 python3.9[194685]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 01:26:27 np0005603500 python3[194837]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 01:26:27 np0005603500 nova_compute[182934]: 2026-01-31 06:26:27.247 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:26:27 np0005603500 nova_compute[182934]: 2026-01-31 06:26:27.248 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:26:27 np0005603500 podman[194874]: 2026-01-31 06:26:27.263392941 +0000 UTC m=+0.042803798 container create 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:26:27 np0005603500 podman[194874]: 2026-01-31 06:26:27.238087822 +0000 UTC m=+0.017498699 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Jan 31 01:26:27 np0005603500 python3[194837]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 31 01:26:27 np0005603500 nova_compute[182934]: 2026-01-31 06:26:27.278 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:26:27 np0005603500 nova_compute[182934]: 2026-01-31 06:26:27.807 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:26:27 np0005603500 nova_compute[182934]: 2026-01-31 06:26:27.809 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:26:27 np0005603500 nova_compute[182934]: 2026-01-31 06:26:27.809 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:26:27 np0005603500 nova_compute[182934]: 2026-01-31 06:26:27.809 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:26:27 np0005603500 python3.9[195064]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:26:28 np0005603500 python3.9[195218]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:29 np0005603500 python3.9[195294]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:26:29 np0005603500 podman[195417]: 2026-01-31 06:26:29.610792224 +0000 UTC m=+0.090136260 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:26:29 np0005603500 python3.9[195462]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769840789.1502252-685-237225704312108/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:30 np0005603500 python3.9[195540]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:26:30 np0005603500 systemd[1]: Reloading.
Jan 31 01:26:30 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:26:30 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:26:31 np0005603500 python3.9[195653]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:26:31 np0005603500 systemd[1]: Reloading.
Jan 31 01:26:31 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:26:31 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:26:31 np0005603500 systemd[1]: Starting node_exporter container...
Jan 31 01:26:31 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:26:31 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd7acd57426a79f0967caa065de8a27ec24c942f19ded8c529ad3e0ca1db26d8/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 31 01:26:31 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd7acd57426a79f0967caa065de8a27ec24c942f19ded8c529ad3e0ca1db26d8/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 31 01:26:31 np0005603500 systemd[1]: Started /usr/bin/podman healthcheck run 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb.
Jan 31 01:26:31 np0005603500 podman[195693]: 2026-01-31 06:26:31.706263398 +0000 UTC m=+0.151262684 container init 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.721Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.721Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.721Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=node_exporter.go:117 level=info collector=arp
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=node_exporter.go:117 level=info collector=bcache
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=node_exporter.go:117 level=info collector=bonding
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.722Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=cpu
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=edac
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=filefd
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=netclass
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=netdev
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=netstat
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=nfs
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=nvme
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=softnet
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=systemd
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=xfs
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=node_exporter.go:117 level=info collector=zfs
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.723Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 31 01:26:31 np0005603500 node_exporter[195709]: ts=2026-01-31T06:26:31.724Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 31 01:26:31 np0005603500 podman[195693]: 2026-01-31 06:26:31.744775589 +0000 UTC m=+0.189774795 container start 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:26:31 np0005603500 podman[195693]: node_exporter
Jan 31 01:26:31 np0005603500 systemd[1]: Started node_exporter container.
Jan 31 01:26:31 np0005603500 podman[195718]: 2026-01-31 06:26:31.812099649 +0000 UTC m=+0.059435899 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:26:32 np0005603500 python3.9[195892]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 01:26:33 np0005603500 python3.9[196044]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:33 np0005603500 python3.9[196169]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840792.914676-730-184883379303183/.source.yaml _original_basename=.jy1krts4 follow=False checksum=59de75cdbd004c866dec393250f3f6aabe194aaa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:34 np0005603500 python3.9[196321]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:34 np0005603500 python3.9[196444]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840793.9507523-745-248392836844823/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:35 np0005603500 python3.9[196596]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:36 np0005603500 python3.9[196748]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:37 np0005603500 python3.9[196900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:37 np0005603500 python3.9[196978]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=._gw5b96b recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:38 np0005603500 python3.9[197128]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:40 np0005603500 python3.9[197551]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 31 01:26:41 np0005603500 python3.9[197703]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 01:26:41 np0005603500 podman[197827]: 2026-01-31 06:26:41.616221942 +0000 UTC m=+0.076955079 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 01:26:41 np0005603500 python3[197873]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 01:26:43 np0005603500 podman[197894]: 2026-01-31 06:26:43.37998888 +0000 UTC m=+1.476620564 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Jan 31 01:26:43 np0005603500 podman[197991]: 2026-01-31 06:26:43.515234581 +0000 UTC m=+0.060814153 container create d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter)
Jan 31 01:26:43 np0005603500 podman[197991]: 2026-01-31 06:26:43.473063864 +0000 UTC m=+0.018643456 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Jan 31 01:26:43 np0005603500 python3[197873]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 31 01:26:44 np0005603500 python3.9[198181]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:26:44 np0005603500 python3.9[198335]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:45 np0005603500 python3.9[198411]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:26:46 np0005603500 python3.9[198562]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769840805.4910588-857-143610661608244/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:46 np0005603500 python3.9[198638]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:26:46 np0005603500 systemd[1]: Reloading.
Jan 31 01:26:46 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:26:46 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:26:47 np0005603500 podman[198675]: 2026-01-31 06:26:47.178304673 +0000 UTC m=+0.089375048 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:26:47 np0005603500 systemd[1]: 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d-47188e907a613813.service: Main process exited, code=exited, status=1/FAILURE
Jan 31 01:26:47 np0005603500 systemd[1]: 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d-47188e907a613813.service: Failed with result 'exit-code'.
Jan 31 01:26:47 np0005603500 python3.9[198770]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:26:47 np0005603500 systemd[1]: Reloading.
Jan 31 01:26:47 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:26:47 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:26:47 np0005603500 systemd[1]: Starting podman_exporter container...
Jan 31 01:26:48 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:26:48 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a6f98e2e2eb3c53974864d2bb3033480584e26b61943d1e8fbe2cde8e08f078/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 31 01:26:48 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a6f98e2e2eb3c53974864d2bb3033480584e26b61943d1e8fbe2cde8e08f078/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 31 01:26:48 np0005603500 systemd[1]: Started /usr/bin/podman healthcheck run d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a.
Jan 31 01:26:48 np0005603500 podman[198810]: 2026-01-31 06:26:48.60025612 +0000 UTC m=+0.634958743 container init d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:26:48 np0005603500 podman_exporter[198826]: ts=2026-01-31T06:26:48.629Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 31 01:26:48 np0005603500 podman_exporter[198826]: ts=2026-01-31T06:26:48.629Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 31 01:26:48 np0005603500 podman_exporter[198826]: ts=2026-01-31T06:26:48.630Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 31 01:26:48 np0005603500 podman_exporter[198826]: ts=2026-01-31T06:26:48.630Z caller=handler.go:105 level=info collector=container
Jan 31 01:26:48 np0005603500 podman[198810]: 2026-01-31 06:26:48.635871436 +0000 UTC m=+0.670574079 container start d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:26:48 np0005603500 systemd[1]: Starting Podman API Service...
Jan 31 01:26:48 np0005603500 systemd[1]: Started Podman API Service.
Jan 31 01:26:48 np0005603500 podman[198810]: podman_exporter
Jan 31 01:26:48 np0005603500 systemd[1]: Started podman_exporter container.
Jan 31 01:26:48 np0005603500 podman[198842]: time="2026-01-31T06:26:48Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 31 01:26:48 np0005603500 podman[198842]: time="2026-01-31T06:26:48Z" level=info msg="Setting parallel job count to 25"
Jan 31 01:26:48 np0005603500 podman[198842]: time="2026-01-31T06:26:48Z" level=info msg="Using sqlite as database backend"
Jan 31 01:26:48 np0005603500 podman[198842]: time="2026-01-31T06:26:48Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 31 01:26:48 np0005603500 podman[198842]: time="2026-01-31T06:26:48Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 31 01:26:48 np0005603500 podman[198842]: time="2026-01-31T06:26:48Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 31 01:26:48 np0005603500 podman[198842]: @ - - [31/Jan/2026:06:26:48 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 31 01:26:48 np0005603500 podman[198842]: time="2026-01-31T06:26:48Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 31 01:26:48 np0005603500 podman[198835]: 2026-01-31 06:26:48.701503048 +0000 UTC m=+0.059000120 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:26:48 np0005603500 systemd[1]: d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a-648795427e5940.service: Main process exited, code=exited, status=1/FAILURE
Jan 31 01:26:48 np0005603500 systemd[1]: d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a-648795427e5940.service: Failed with result 'exit-code'.
Jan 31 01:26:48 np0005603500 podman[198842]: @ - - [31/Jan/2026:06:26:48 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18593 "" "Go-http-client/1.1"
Jan 31 01:26:48 np0005603500 podman_exporter[198826]: ts=2026-01-31T06:26:48.706Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 31 01:26:48 np0005603500 podman_exporter[198826]: ts=2026-01-31T06:26:48.707Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 31 01:26:48 np0005603500 podman_exporter[198826]: ts=2026-01-31T06:26:48.707Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 31 01:26:49 np0005603500 python3.9[199021]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 01:26:50 np0005603500 python3.9[199173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:50 np0005603500 python3.9[199298]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840809.8342743-902-274228039095582/.source.yaml _original_basename=.z89mh3up follow=False checksum=52655dbb4749492facb67835359a3b527ad6e5d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:51 np0005603500 python3.9[199450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:51 np0005603500 python3.9[199573]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769840811.043918-917-252572624039243/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:52 np0005603500 python3.9[199725]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:53 np0005603500 python3.9[199877]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 01:26:54 np0005603500 python3.9[200029]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:26:54 np0005603500 python3.9[200107]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.gsltkfnm recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:55 np0005603500 python3.9[200257]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:26:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:26:55.276 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:26:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:26:55.277 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:26:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:26:55.277 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:26:57 np0005603500 python3.9[200681]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 31 01:26:57 np0005603500 python3.9[200833]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 01:26:58 np0005603500 python3[200985]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 01:27:00 np0005603500 podman[201012]: 2026-01-31 06:27:00.149518016 +0000 UTC m=+0.055471385 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 01:27:02 np0005603500 podman[201063]: 2026-01-31 06:27:02.312771457 +0000 UTC m=+0.202979693 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:27:02 np0005603500 podman[200998]: 2026-01-31 06:27:02.887754949 +0000 UTC m=+4.036394427 image pull 2679468753c61ac8a0e14904b347eedc3a9181a15e3bff0987683c22e1f9cae7 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d
Jan 31 01:27:03 np0005603500 podman[201141]: 2026-01-31 06:27:02.994244816 +0000 UTC m=+0.021764322 image pull 2679468753c61ac8a0e14904b347eedc3a9181a15e3bff0987683c22e1f9cae7 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d
Jan 31 01:27:03 np0005603500 podman[201141]: 2026-01-31 06:27:03.102400967 +0000 UTC m=+0.129920483 container create bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, build-date=2026-01-22T05:09:47Z, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, distribution-scope=public)
Jan 31 01:27:03 np0005603500 python3[200985]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d
Jan 31 01:27:03 np0005603500 python3.9[201331]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:27:04 np0005603500 python3.9[201485]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:05 np0005603500 python3.9[201561]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:27:05 np0005603500 python3.9[201712]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769840825.1249166-1029-244420547006678/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:06 np0005603500 python3.9[201788]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:27:06 np0005603500 systemd[1]: Reloading.
Jan 31 01:27:06 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:27:06 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:27:07 np0005603500 python3.9[201899]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 01:27:07 np0005603500 systemd[1]: Reloading.
Jan 31 01:27:07 np0005603500 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:27:07 np0005603500 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 01:27:07 np0005603500 systemd[1]: Starting openstack_network_exporter container...
Jan 31 01:27:07 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:27:07 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/779a6d3cb2890f98683bf025fcb85504ecb084cdee231d774448b1ec7da07124/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 31 01:27:07 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/779a6d3cb2890f98683bf025fcb85504ecb084cdee231d774448b1ec7da07124/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 31 01:27:07 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/779a6d3cb2890f98683bf025fcb85504ecb084cdee231d774448b1ec7da07124/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 31 01:27:07 np0005603500 systemd[1]: Started /usr/bin/podman healthcheck run bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b.
Jan 31 01:27:07 np0005603500 podman[201939]: 2026-01-31 06:27:07.56483291 +0000 UTC m=+0.134818780 container init bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, distribution-scope=public, maintainer=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:48: registering *bridge.Collector
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:48: registering *coverage.Collector
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:48: registering *datapath.Collector
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:48: registering *iface.Collector
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:48: registering *memory.Collector
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:48: registering *ovn.Collector
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:48: registering *pmd_perf.Collector
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:48: registering *pmd_rxq.Collector
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: INFO    06:27:07 main.go:48: registering *vswitch.Collector
Jan 31 01:27:07 np0005603500 openstack_network_exporter[201955]: NOTICE  06:27:07 main.go:76: listening on https://:9105/metrics
Jan 31 01:27:07 np0005603500 podman[201939]: 2026-01-31 06:27:07.590137154 +0000 UTC m=+0.160123064 container start bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, release=1769056855, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter)
Jan 31 01:27:07 np0005603500 podman[201939]: openstack_network_exporter
Jan 31 01:27:07 np0005603500 systemd[1]: Started openstack_network_exporter container.
Jan 31 01:27:07 np0005603500 podman[201965]: 2026-01-31 06:27:07.675311834 +0000 UTC m=+0.076834573 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1769056855, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, architecture=x86_64, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 31 01:27:08 np0005603500 python3.9[202136]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 01:27:09 np0005603500 python3.9[202288]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:09 np0005603500 python3.9[202413]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840828.8107364-1074-270444999047594/.source.yaml _original_basename=.6dbnyyq5 follow=False checksum=9e304fc0f5ebb6c1b90ab6943b2d9026dc93b5a9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:10 np0005603500 python3.9[202565]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 01:27:11 np0005603500 auditd[701]: Audit daemon rotating log files
Jan 31 01:27:11 np0005603500 python3.9[202717]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 31 01:27:11 np0005603500 podman[202854]: 2026-01-31 06:27:11.99328048 +0000 UTC m=+0.091845957 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:27:12 np0005603500 python3.9[202902]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:12 np0005603500 systemd[1]: Started libpod-conmon-072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d.scope.
Jan 31 01:27:12 np0005603500 podman[202910]: 2026-01-31 06:27:12.253684469 +0000 UTC m=+0.073361352 container exec 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:27:12 np0005603500 podman[202910]: 2026-01-31 06:27:12.289073318 +0000 UTC m=+0.108750201 container exec_died 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:27:12 np0005603500 systemd[1]: libpod-conmon-072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d.scope: Deactivated successfully.
Jan 31 01:27:12 np0005603500 python3.9[203093]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:12 np0005603500 systemd[1]: Started libpod-conmon-072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d.scope.
Jan 31 01:27:12 np0005603500 podman[203094]: 2026-01-31 06:27:12.974500523 +0000 UTC m=+0.063499054 container exec 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:27:13 np0005603500 podman[203094]: 2026-01-31 06:27:13.005059177 +0000 UTC m=+0.094057708 container exec_died 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 01:27:13 np0005603500 systemd[1]: libpod-conmon-072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d.scope: Deactivated successfully.
Jan 31 01:27:13 np0005603500 python3.9[203278]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:14 np0005603500 python3.9[203430]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 31 01:27:14 np0005603500 python3.9[203595]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:15 np0005603500 systemd[1]: Started libpod-conmon-b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227.scope.
Jan 31 01:27:15 np0005603500 podman[203596]: 2026-01-31 06:27:15.021971228 +0000 UTC m=+0.075458379 container exec b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:27:15 np0005603500 podman[203596]: 2026-01-31 06:27:15.052003334 +0000 UTC m=+0.105490505 container exec_died b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true)
Jan 31 01:27:15 np0005603500 systemd[1]: libpod-conmon-b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227.scope: Deactivated successfully.
Jan 31 01:27:15 np0005603500 python3.9[203778]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:15 np0005603500 systemd[1]: Started libpod-conmon-b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227.scope.
Jan 31 01:27:15 np0005603500 podman[203779]: 2026-01-31 06:27:15.762150956 +0000 UTC m=+0.073951021 container exec b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 01:27:15 np0005603500 podman[203779]: 2026-01-31 06:27:15.792552954 +0000 UTC m=+0.104353019 container exec_died b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 01:27:15 np0005603500 systemd[1]: libpod-conmon-b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227.scope: Deactivated successfully.
Jan 31 01:27:16 np0005603500 python3.9[203963]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:16 np0005603500 python3.9[204115]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 31 01:27:17 np0005603500 podman[204252]: 2026-01-31 06:27:17.434546531 +0000 UTC m=+0.056584822 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 31 01:27:17 np0005603500 systemd[1]: 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d-47188e907a613813.service: Main process exited, code=exited, status=1/FAILURE
Jan 31 01:27:17 np0005603500 systemd[1]: 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d-47188e907a613813.service: Failed with result 'exit-code'.
Jan 31 01:27:17 np0005603500 python3.9[204298]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:17 np0005603500 systemd[1]: Started libpod-conmon-0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d.scope.
Jan 31 01:27:17 np0005603500 podman[204300]: 2026-01-31 06:27:17.779384527 +0000 UTC m=+0.150735821 container exec 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 31 01:27:17 np0005603500 podman[204320]: 2026-01-31 06:27:17.969850116 +0000 UTC m=+0.181129269 container exec_died 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 01:27:17 np0005603500 podman[204300]: 2026-01-31 06:27:17.981092277 +0000 UTC m=+0.352443571 container exec_died 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 31 01:27:17 np0005603500 systemd[1]: libpod-conmon-0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d.scope: Deactivated successfully.
Jan 31 01:27:18 np0005603500 python3.9[204484]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:18 np0005603500 systemd[1]: Started libpod-conmon-0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d.scope.
Jan 31 01:27:18 np0005603500 podman[204485]: 2026-01-31 06:27:18.687887982 +0000 UTC m=+0.075983257 container exec 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 31 01:27:18 np0005603500 podman[204505]: 2026-01-31 06:27:18.747731407 +0000 UTC m=+0.051311032 container exec_died 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 31 01:27:18 np0005603500 podman[204485]: 2026-01-31 06:27:18.753156631 +0000 UTC m=+0.141251916 container exec_died 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Jan 31 01:27:18 np0005603500 systemd[1]: libpod-conmon-0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d.scope: Deactivated successfully.
Jan 31 01:27:18 np0005603500 podman[204518]: 2026-01-31 06:27:18.847660923 +0000 UTC m=+0.057937806 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 31 01:27:19 np0005603500 python3.9[204694]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:20 np0005603500 python3.9[204846]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 31 01:27:20 np0005603500 python3.9[205012]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:20 np0005603500 systemd[1]: Started libpod-conmon-043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb.scope.
Jan 31 01:27:20 np0005603500 podman[205013]: 2026-01-31 06:27:20.870940218 +0000 UTC m=+0.161850628 container exec 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:27:20 np0005603500 podman[205013]: 2026-01-31 06:27:20.903307789 +0000 UTC m=+0.194218199 container exec_died 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 31 01:27:21 np0005603500 systemd[1]: libpod-conmon-043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb.scope: Deactivated successfully.
Jan 31 01:27:21 np0005603500 python3.9[205198]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:21 np0005603500 systemd[1]: Started libpod-conmon-043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb.scope.
Jan 31 01:27:21 np0005603500 podman[205199]: 2026-01-31 06:27:21.849681142 +0000 UTC m=+0.086101161 container exec 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:27:21 np0005603500 podman[205219]: 2026-01-31 06:27:21.924899993 +0000 UTC m=+0.061233082 container exec_died 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:27:21 np0005603500 podman[205199]: 2026-01-31 06:27:21.931116023 +0000 UTC m=+0.167536052 container exec_died 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:27:21 np0005603500 systemd[1]: libpod-conmon-043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb.scope: Deactivated successfully.
Jan 31 01:27:22 np0005603500 python3.9[205383]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:23 np0005603500 python3.9[205535]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 31 01:27:24 np0005603500 python3.9[205700]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:24 np0005603500 systemd[1]: Started libpod-conmon-d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a.scope.
Jan 31 01:27:24 np0005603500 podman[205701]: 2026-01-31 06:27:24.167142484 +0000 UTC m=+0.076717369 container exec d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:27:24 np0005603500 podman[205721]: 2026-01-31 06:27:24.232704664 +0000 UTC m=+0.055390143 container exec_died d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:27:24 np0005603500 podman[205701]: 2026-01-31 06:27:24.238828032 +0000 UTC m=+0.148402897 container exec_died d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:27:24 np0005603500 systemd[1]: libpod-conmon-d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a.scope: Deactivated successfully.
Jan 31 01:27:24 np0005603500 nova_compute[182934]: 2026-01-31 06:27:24.282 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:27:24 np0005603500 nova_compute[182934]: 2026-01-31 06:27:24.284 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:27:24 np0005603500 python3.9[205883]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:25 np0005603500 systemd[1]: Started libpod-conmon-d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a.scope.
Jan 31 01:27:25 np0005603500 podman[205884]: 2026-01-31 06:27:25.019340117 +0000 UTC m=+0.059496796 container exec d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:27:25 np0005603500 podman[205903]: 2026-01-31 06:27:25.077770087 +0000 UTC m=+0.051131596 container exec_died d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:27:25 np0005603500 podman[205884]: 2026-01-31 06:27:25.082653794 +0000 UTC m=+0.122810433 container exec_died d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:27:25 np0005603500 systemd[1]: libpod-conmon-d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a.scope: Deactivated successfully.
Jan 31 01:27:25 np0005603500 nova_compute[182934]: 2026-01-31 06:27:25.297 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:27:25 np0005603500 nova_compute[182934]: 2026-01-31 06:27:25.297 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:27:25 np0005603500 nova_compute[182934]: 2026-01-31 06:27:25.297 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:27:25 np0005603500 nova_compute[182934]: 2026-01-31 06:27:25.297 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:27:25 np0005603500 nova_compute[182934]: 2026-01-31 06:27:25.298 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:27:25 np0005603500 nova_compute[182934]: 2026-01-31 06:27:25.298 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:27:25 np0005603500 nova_compute[182934]: 2026-01-31 06:27:25.298 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:27:25 np0005603500 nova_compute[182934]: 2026-01-31 06:27:25.298 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:27:25 np0005603500 python3.9[206067]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:26 np0005603500 python3.9[206219]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 31 01:27:26 np0005603500 nova_compute[182934]: 2026-01-31 06:27:26.434 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:27:26 np0005603500 nova_compute[182934]: 2026-01-31 06:27:26.435 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:27:26 np0005603500 nova_compute[182934]: 2026-01-31 06:27:26.435 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:27:26 np0005603500 nova_compute[182934]: 2026-01-31 06:27:26.435 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:27:26 np0005603500 nova_compute[182934]: 2026-01-31 06:27:26.575 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:27:26 np0005603500 nova_compute[182934]: 2026-01-31 06:27:26.576 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5878MB free_disk=73.21749114990234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:27:26 np0005603500 nova_compute[182934]: 2026-01-31 06:27:26.577 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:27:26 np0005603500 nova_compute[182934]: 2026-01-31 06:27:26.577 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:27:26 np0005603500 python3.9[206382]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:27 np0005603500 systemd[1]: Started libpod-conmon-bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b.scope.
Jan 31 01:27:27 np0005603500 podman[206383]: 2026-01-31 06:27:27.068293909 +0000 UTC m=+0.074451226 container exec bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 31 01:27:27 np0005603500 podman[206402]: 2026-01-31 06:27:27.131782172 +0000 UTC m=+0.052414568 container exec_died bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1769056855, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 01:27:27 np0005603500 podman[206383]: 2026-01-31 06:27:27.138264031 +0000 UTC m=+0.144421318 container exec_died bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 31 01:27:27 np0005603500 systemd[1]: libpod-conmon-bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b.scope: Deactivated successfully.
Jan 31 01:27:27 np0005603500 python3.9[206565]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 31 01:27:27 np0005603500 nova_compute[182934]: 2026-01-31 06:27:27.716 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:27:27 np0005603500 nova_compute[182934]: 2026-01-31 06:27:27.716 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:27:27 np0005603500 nova_compute[182934]: 2026-01-31 06:27:27.740 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:27:27 np0005603500 systemd[1]: Started libpod-conmon-bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b.scope.
Jan 31 01:27:27 np0005603500 podman[206566]: 2026-01-31 06:27:27.780871439 +0000 UTC m=+0.071640827 container exec bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.7, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, release=1769056855)
Jan 31 01:27:27 np0005603500 podman[206566]: 2026-01-31 06:27:27.814117049 +0000 UTC m=+0.104886427 container exec_died bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, architecture=x86_64, build-date=2026-01-22T05:09:47Z, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., 
container_name=openstack_network_exporter)
Jan 31 01:27:27 np0005603500 systemd[1]: libpod-conmon-bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b.scope: Deactivated successfully.
Jan 31 01:27:28 np0005603500 nova_compute[182934]: 2026-01-31 06:27:28.259 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:27:28 np0005603500 nova_compute[182934]: 2026-01-31 06:27:28.260 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:27:28 np0005603500 nova_compute[182934]: 2026-01-31 06:27:28.260 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:27:28 np0005603500 python3.9[206750]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:29 np0005603500 python3.9[206902]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:29 np0005603500 python3.9[207054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:30 np0005603500 python3.9[207177]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769840849.2642663-1305-188473455190614/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:30 np0005603500 podman[207301]: 2026-01-31 06:27:30.986359536 +0000 UTC m=+0.070714116 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:27:31 np0005603500 python3.9[207344]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:32 np0005603500 python3.9[207500]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:32 np0005603500 python3.9[207578]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:32 np0005603500 podman[207702]: 2026-01-31 06:27:32.966626978 +0000 UTC m=+0.052493830 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:27:33 np0005603500 python3.9[207746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:33 np0005603500 python3.9[207832]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.e1yqx0iy recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:34 np0005603500 python3.9[207984]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:37 np0005603500 python3.9[208063]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:37 np0005603500 python3.9[208215]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:27:38 np0005603500 podman[208293]: 2026-01-31 06:27:38.157509132 +0000 UTC m=+0.080149080 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, distribution-scope=public, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 31 01:27:38 np0005603500 python3[208389]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 01:27:39 np0005603500 python3.9[208541]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:39 np0005603500 python3.9[208619]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:40 np0005603500 python3.9[208771]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:40 np0005603500 python3.9[208849]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:41 np0005603500 python3.9[209001]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:41 np0005603500 python3.9[209079]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:42 np0005603500 podman[209231]: 2026-01-31 06:27:42.13527958 +0000 UTC m=+0.082328800 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:27:42 np0005603500 python3.9[209232]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:42 np0005603500 python3.9[209336]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:43 np0005603500 python3.9[209488]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 01:27:43 np0005603500 python3.9[209613]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769840862.881449-1430-242135556820588/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:44 np0005603500 python3.9[209765]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:45 np0005603500 python3.9[209917]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:27:46 np0005603500 python3.9[210072]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:46 np0005603500 python3.9[210224]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:27:47 np0005603500 python3.9[210377]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 01:27:47 np0005603500 podman[210503]: 2026-01-31 06:27:47.828177449 +0000 UTC m=+0.177812764 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Jan 31 01:27:47 np0005603500 python3.9[210544]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:27:48 np0005603500 python3.9[210704]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:27:49 np0005603500 systemd[1]: session-25.scope: Deactivated successfully.
Jan 31 01:27:49 np0005603500 systemd[1]: session-25.scope: Consumed 1min 36.459s CPU time.
Jan 31 01:27:49 np0005603500 systemd-logind[821]: Session 25 logged out. Waiting for processes to exit.
Jan 31 01:27:49 np0005603500 systemd-logind[821]: Removed session 25.
Jan 31 01:27:49 np0005603500 podman[210729]: 2026-01-31 06:27:49.123290053 +0000 UTC m=+0.044107270 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:27:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:27:55.293 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:27:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:27:55.293 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:27:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:27:55.294 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:28:01 np0005603500 podman[210756]: 2026-01-31 06:28:01.126477233 +0000 UTC m=+0.050419645 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.schema-version=1.0)
Jan 31 01:28:03 np0005603500 podman[210777]: 2026-01-31 06:28:03.137807031 +0000 UTC m=+0.059182536 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:28:09 np0005603500 podman[210801]: 2026-01-31 06:28:09.125914004 +0000 UTC m=+0.047463190 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2026-01-22T05:09:47Z, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Jan 31 01:28:13 np0005603500 podman[210822]: 2026-01-31 06:28:13.157380981 +0000 UTC m=+0.077083109 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.980 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:28:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:28:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:28:18 np0005603500 podman[210850]: 2026-01-31 06:28:18.131433487 +0000 UTC m=+0.051730127 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 01:28:20 np0005603500 podman[210871]: 2026-01-31 06:28:20.183437937 +0000 UTC m=+0.097311627 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.262 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.263 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.263 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.263 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.264 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.264 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.264 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.264 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.265 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.790 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.791 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.791 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.792 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:28:28 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:28.897 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:28:28 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:28.899 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:28:28 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:28.901 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.940 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.941 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5983MB free_disk=73.25014114379883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.942 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:28:28 np0005603500 nova_compute[182934]: 2026-01-31 06:28:28.942 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:28:30 np0005603500 nova_compute[182934]: 2026-01-31 06:28:30.004 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:28:30 np0005603500 nova_compute[182934]: 2026-01-31 06:28:30.004 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:28:30 np0005603500 nova_compute[182934]: 2026-01-31 06:28:30.058 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:28:30 np0005603500 nova_compute[182934]: 2026-01-31 06:28:30.571 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:28:30 np0005603500 nova_compute[182934]: 2026-01-31 06:28:30.572 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:28:30 np0005603500 nova_compute[182934]: 2026-01-31 06:28:30.573 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:28:32 np0005603500 podman[210896]: 2026-01-31 06:28:32.128422264 +0000 UTC m=+0.044545868 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 01:28:34 np0005603500 podman[210917]: 2026-01-31 06:28:34.139477851 +0000 UTC m=+0.054183455 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.152 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:32:0c 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-cbe36913-7304-4c39-83eb-fb9e46ffef26', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cbe36913-7304-4c39-83eb-fb9e46ffef26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '258ad37c7a194c8cb9fd805ff19f8fe0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0c3ba82-8da2-460e-8453-866e77dec134, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1d3d0d94-dcd7-4292-870a-4620ea9516eb) old=Port_Binding(mac=['fa:16:3e:d5:32:0c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cbe36913-7304-4c39-83eb-fb9e46ffef26', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cbe36913-7304-4c39-83eb-fb9e46ffef26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '258ad37c7a194c8cb9fd805ff19f8fe0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.154 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1d3d0d94-dcd7-4292-870a-4620ea9516eb in datapath cbe36913-7304-4c39-83eb-fb9e46ffef26 updated
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.156 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cbe36913-7304-4c39-83eb-fb9e46ffef26, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.158 104644 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpncu0drmd/privsep.sock']
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.860 104644 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.861 104644 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpncu0drmd/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:366
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.733 210946 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.736 210946 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.738 210946 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.738 210946 INFO oslo.privsep.daemon [-] privsep daemon running as pid 210946
Jan 31 01:28:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:35.864 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[b6797025-ed24-417f-9a7a-992899b69253]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:28:36 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:36.482 210946 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:28:36 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:36.482 210946 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:28:36 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:36.482 210946 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:28:36 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:36.869 210946 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 31 01:28:36 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:36.873 210946 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 31 01:28:36 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:36.904 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac88f80-966a-4f44-b3df-c0e8cafcdd30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:28:40 np0005603500 podman[210953]: 2026-01-31 06:28:40.138535205 +0000 UTC m=+0.057578535 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter)
Jan 31 01:28:44 np0005603500 podman[210974]: 2026-01-31 06:28:44.128296596 +0000 UTC m=+0.050607990 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 01:28:49 np0005603500 podman[211000]: 2026-01-31 06:28:49.139621776 +0000 UTC m=+0.061094657 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 31 01:28:51 np0005603500 podman[211020]: 2026-01-31 06:28:51.145088646 +0000 UTC m=+0.058874586 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 31 01:28:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:55.354 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:28:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:55.355 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:28:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:28:55.355 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:29:03 np0005603500 podman[211046]: 2026-01-31 06:29:03.127130853 +0000 UTC m=+0.049826550 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 01:29:05 np0005603500 podman[211066]: 2026-01-31 06:29:05.169642021 +0000 UTC m=+0.083654781 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 31 01:29:11 np0005603500 podman[211091]: 2026-01-31 06:29:11.118624082 +0000 UTC m=+0.042235784 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7)
Jan 31 01:29:15 np0005603500 podman[211112]: 2026-01-31 06:29:15.137325662 +0000 UTC m=+0.061344085 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 01:29:20 np0005603500 podman[211139]: 2026-01-31 06:29:20.134144789 +0000 UTC m=+0.056187467 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 01:29:22 np0005603500 podman[211159]: 2026-01-31 06:29:22.134189237 +0000 UTC m=+0.053430158 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:29:27 np0005603500 nova_compute[182934]: 2026-01-31 06:29:27.453 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:29:27 np0005603500 nova_compute[182934]: 2026-01-31 06:29:27.454 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:29:29 np0005603500 nova_compute[182934]: 2026-01-31 06:29:29.793 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:29:29 np0005603500 nova_compute[182934]: 2026-01-31 06:29:29.793 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:29:29 np0005603500 nova_compute[182934]: 2026-01-31 06:29:29.794 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:29:29 np0005603500 nova_compute[182934]: 2026-01-31 06:29:29.794 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:29:29 np0005603500 nova_compute[182934]: 2026-01-31 06:29:29.794 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:29:29 np0005603500 nova_compute[182934]: 2026-01-31 06:29:29.794 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:29:29 np0005603500 nova_compute[182934]: 2026-01-31 06:29:29.794 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:29:29 np0005603500 nova_compute[182934]: 2026-01-31 06:29:29.794 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:29:30 np0005603500 nova_compute[182934]: 2026-01-31 06:29:30.311 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:29:30 np0005603500 nova_compute[182934]: 2026-01-31 06:29:30.312 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:29:30 np0005603500 nova_compute[182934]: 2026-01-31 06:29:30.312 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:29:30 np0005603500 nova_compute[182934]: 2026-01-31 06:29:30.312 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:29:30 np0005603500 nova_compute[182934]: 2026-01-31 06:29:30.446 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:29:30 np0005603500 nova_compute[182934]: 2026-01-31 06:29:30.447 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5946MB free_disk=73.25014114379883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:29:30 np0005603500 nova_compute[182934]: 2026-01-31 06:29:30.448 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:29:30 np0005603500 nova_compute[182934]: 2026-01-31 06:29:30.448 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:29:31 np0005603500 nova_compute[182934]: 2026-01-31 06:29:31.537 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:29:31 np0005603500 nova_compute[182934]: 2026-01-31 06:29:31.537 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:29:31 np0005603500 nova_compute[182934]: 2026-01-31 06:29:31.560 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:29:32 np0005603500 nova_compute[182934]: 2026-01-31 06:29:32.070 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:29:32 np0005603500 nova_compute[182934]: 2026-01-31 06:29:32.072 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:29:32 np0005603500 nova_compute[182934]: 2026-01-31 06:29:32.072 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:29:34 np0005603500 podman[211184]: 2026-01-31 06:29:34.131191059 +0000 UTC m=+0.052245801 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Jan 31 01:29:36 np0005603500 podman[211203]: 2026-01-31 06:29:36.123382652 +0000 UTC m=+0.045511711 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:29:42 np0005603500 podman[211226]: 2026-01-31 06:29:42.167020419 +0000 UTC m=+0.087793524 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.7, io.openshift.expose-services=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 31 01:29:46 np0005603500 podman[211247]: 2026-01-31 06:29:46.175653733 +0000 UTC m=+0.093658156 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 31 01:29:51 np0005603500 podman[211273]: 2026-01-31 06:29:51.143705706 +0000 UTC m=+0.061910223 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:29:53 np0005603500 podman[211294]: 2026-01-31 06:29:53.12766069 +0000 UTC m=+0.048747335 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:29:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:29:55.417 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:29:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:29:55.417 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:29:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:29:55.417 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:30:05 np0005603500 podman[211319]: 2026-01-31 06:30:05.149559328 +0000 UTC m=+0.073710417 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true)
Jan 31 01:30:07 np0005603500 podman[211337]: 2026-01-31 06:30:07.125580275 +0000 UTC m=+0.042894053 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:30:13 np0005603500 podman[211361]: 2026-01-31 06:30:13.120461777 +0000 UTC m=+0.044215576 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, version=9.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7)
Jan 31 01:30:17 np0005603500 podman[211383]: 2026-01-31 06:30:17.138002388 +0000 UTC m=+0.061467543 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.981 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.983 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:30:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:30:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:30:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:19.977 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:30:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:19.978 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:30:21 np0005603500 nova_compute[182934]: 2026-01-31 06:30:21.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:21 np0005603500 nova_compute[182934]: 2026-01-31 06:30:21.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Jan 31 01:30:21 np0005603500 nova_compute[182934]: 2026-01-31 06:30:21.666 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Jan 31 01:30:21 np0005603500 nova_compute[182934]: 2026-01-31 06:30:21.667 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:21 np0005603500 nova_compute[182934]: 2026-01-31 06:30:21.667 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Jan 31 01:30:22 np0005603500 podman[211408]: 2026-01-31 06:30:22.137722002 +0000 UTC m=+0.054618050 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 31 01:30:22 np0005603500 nova_compute[182934]: 2026-01-31 06:30:22.449 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.002 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:24 np0005603500 podman[211428]: 2026-01-31 06:30:24.134505983 +0000 UTC m=+0.054083651 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.671 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.671 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.672 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.672 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.852 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.853 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5984MB free_disk=73.25014114379883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.854 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:30:24 np0005603500 nova_compute[182934]: 2026-01-31 06:30:24.854 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:30:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:24.979 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:30:25 np0005603500 nova_compute[182934]: 2026-01-31 06:30:25.912 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:30:25 np0005603500 nova_compute[182934]: 2026-01-31 06:30:25.913 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:30:25 np0005603500 nova_compute[182934]: 2026-01-31 06:30:25.939 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:30:26 np0005603500 nova_compute[182934]: 2026-01-31 06:30:26.458 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:30:26 np0005603500 nova_compute[182934]: 2026-01-31 06:30:26.460 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:30:26 np0005603500 nova_compute[182934]: 2026-01-31 06:30:26.460 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:30:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:27.253 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:96:78 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06664cb8-3ec3-4b12-9420-23c1bc38e360, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bbc7e9d4-0e42-4731-a0a1-d6912b3f33d9) old=Port_Binding(mac=['fa:16:3e:ca:96:78'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:30:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:27.254 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bbc7e9d4-0e42-4731-a0a1-d6912b3f33d9 in datapath 349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63 updated
Jan 31 01:30:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:27.256 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:30:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:27.257 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c48ce981-1e89-4c49-aff4-e6f6683fa05d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:30:27 np0005603500 nova_compute[182934]: 2026-01-31 06:30:27.460 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:27 np0005603500 nova_compute[182934]: 2026-01-31 06:30:27.461 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:27 np0005603500 nova_compute[182934]: 2026-01-31 06:30:27.461 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:27 np0005603500 nova_compute[182934]: 2026-01-31 06:30:27.461 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:27 np0005603500 nova_compute[182934]: 2026-01-31 06:30:27.461 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:27 np0005603500 nova_compute[182934]: 2026-01-31 06:30:27.461 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:30:28 np0005603500 nova_compute[182934]: 2026-01-31 06:30:28.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:30:36 np0005603500 podman[211452]: 2026-01-31 06:30:36.129962824 +0000 UTC m=+0.047782504 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 01:30:38 np0005603500 podman[211471]: 2026-01-31 06:30:38.132877105 +0000 UTC m=+0.055233459 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:30:44 np0005603500 podman[211495]: 2026-01-31 06:30:44.125383821 +0000 UTC m=+0.048026492 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1769056855, build-date=2026-01-22T05:09:47Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.7, io.buildah.version=1.33.7)
Jan 31 01:30:48 np0005603500 podman[211516]: 2026-01-31 06:30:48.136356555 +0000 UTC m=+0.060139170 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 01:30:53 np0005603500 podman[211542]: 2026-01-31 06:30:53.122408949 +0000 UTC m=+0.046441700 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 31 01:30:55 np0005603500 nova_compute[182934]: 2026-01-31 06:30:55.006 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "7dedc0e6-e769-4fda-b465-152126c73743" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:30:55 np0005603500 nova_compute[182934]: 2026-01-31 06:30:55.006 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:30:55 np0005603500 podman[211564]: 2026-01-31 06:30:55.120377469 +0000 UTC m=+0.036039148 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:30:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:55.477 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:30:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:55.477 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:30:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:30:55.477 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:30:55 np0005603500 nova_compute[182934]: 2026-01-31 06:30:55.580 182938 DEBUG nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:30:56 np0005603500 nova_compute[182934]: 2026-01-31 06:30:56.958 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:30:56 np0005603500 nova_compute[182934]: 2026-01-31 06:30:56.959 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:30:56 np0005603500 nova_compute[182934]: 2026-01-31 06:30:56.966 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:30:56 np0005603500 nova_compute[182934]: 2026-01-31 06:30:56.966 182938 INFO nova.compute.claims [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:30:58 np0005603500 nova_compute[182934]: 2026-01-31 06:30:58.263 182938 DEBUG nova.scheduler.client.report [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Refreshing inventories for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Jan 31 01:30:58 np0005603500 nova_compute[182934]: 2026-01-31 06:30:58.482 182938 DEBUG nova.scheduler.client.report [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Updating ProviderTree inventory for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Jan 31 01:30:58 np0005603500 nova_compute[182934]: 2026-01-31 06:30:58.482 182938 DEBUG nova.compute.provider_tree [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:30:58 np0005603500 nova_compute[182934]: 2026-01-31 06:30:58.497 182938 DEBUG nova.scheduler.client.report [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Refreshing aggregate associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Jan 31 01:30:58 np0005603500 nova_compute[182934]: 2026-01-31 06:30:58.522 182938 DEBUG nova.scheduler.client.report [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Refreshing trait associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, traits: COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,HW_ARCH_X86_64,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_CRB,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SHA,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_FMA3,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Jan 31 01:30:58 np0005603500 nova_compute[182934]: 2026-01-31 06:30:58.781 182938 DEBUG nova.compute.provider_tree [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:30:59 np0005603500 nova_compute[182934]: 2026-01-31 06:30:59.300 182938 DEBUG nova.scheduler.client.report [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:30:59 np0005603500 nova_compute[182934]: 2026-01-31 06:30:59.887 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:30:59 np0005603500 nova_compute[182934]: 2026-01-31 06:30:59.888 182938 DEBUG nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:31:00 np0005603500 nova_compute[182934]: 2026-01-31 06:31:00.462 182938 DEBUG nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:31:00 np0005603500 nova_compute[182934]: 2026-01-31 06:31:00.462 182938 DEBUG nova.network.neutron [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:31:00 np0005603500 nova_compute[182934]: 2026-01-31 06:31:00.981 182938 INFO nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:31:01 np0005603500 nova_compute[182934]: 2026-01-31 06:31:01.509 182938 DEBUG nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:31:02 np0005603500 nova_compute[182934]: 2026-01-31 06:31:02.537 182938 DEBUG nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:31:02 np0005603500 nova_compute[182934]: 2026-01-31 06:31:02.539 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:31:02 np0005603500 nova_compute[182934]: 2026-01-31 06:31:02.539 182938 INFO nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Creating image(s)
Jan 31 01:31:02 np0005603500 nova_compute[182934]: 2026-01-31 06:31:02.540 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:02 np0005603500 nova_compute[182934]: 2026-01-31 06:31:02.540 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:02 np0005603500 nova_compute[182934]: 2026-01-31 06:31:02.541 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:02 np0005603500 nova_compute[182934]: 2026-01-31 06:31:02.541 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:02 np0005603500 nova_compute[182934]: 2026-01-31 06:31:02.541 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:02 np0005603500 nova_compute[182934]: 2026-01-31 06:31:02.571 182938 DEBUG nova.policy [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:31:06 np0005603500 nova_compute[182934]: 2026-01-31 06:31:05.999 182938 DEBUG nova.network.neutron [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Successfully created port: 57786dbf-8d90-4cbe-834d-3cd072a75d1f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:31:07 np0005603500 podman[211590]: 2026-01-31 06:31:07.126144462 +0000 UTC m=+0.049184278 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 01:31:07 np0005603500 nova_compute[182934]: 2026-01-31 06:31:07.472 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:31:07 np0005603500 nova_compute[182934]: 2026-01-31 06:31:07.476 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:31:07 np0005603500 nova_compute[182934]: 2026-01-31 06:31:07.476 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:07 np0005603500 nova_compute[182934]: 2026-01-31 06:31:07.522 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39.part --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:07 np0005603500 nova_compute[182934]: 2026-01-31 06:31:07.523 182938 DEBUG nova.virt.images [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] 9f613975-b701-42a0-9b35-7d5c4a2cb7f2 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:278
Jan 31 01:31:07 np0005603500 nova_compute[182934]: 2026-01-31 06:31:07.660 182938 DEBUG nova.privsep.utils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 31 01:31:07 np0005603500 nova_compute[182934]: 2026-01-31 06:31:07.661 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39.part /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:09 np0005603500 podman[211621]: 2026-01-31 06:31:09.143211143 +0000 UTC m=+0.061874975 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.061 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39.part /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39.converted" returned: 0 in 2.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.067 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.143 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39.converted --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.145 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 7.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.146 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.151 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.153 182938 INFO oslo.privsep.daemon [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpqin0qnp_/privsep.sock']
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.768 182938 INFO oslo.privsep.daemon [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Spawned new privsep daemon via rootwrap
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.648 211654 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.651 211654 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.653 211654 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.653 211654 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211654
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.855 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.908 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.909 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.910 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.911 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.914 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.914 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.966 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:10 np0005603500 nova_compute[182934]: 2026-01-31 06:31:10.967 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.128 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk 1073741824" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.129 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.129 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.189 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.190 182938 DEBUG nova.virt.disk.api [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.191 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.250 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.251 182938 DEBUG nova.virt.disk.api [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.252 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.253 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Ensure instance console log exists: /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.253 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.254 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.254 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.350 182938 DEBUG nova.network.neutron [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Successfully updated port: 57786dbf-8d90-4cbe-834d-3cd072a75d1f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.881 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.882 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:31:11 np0005603500 nova_compute[182934]: 2026-01-31 06:31:11.882 182938 DEBUG nova.network.neutron [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:31:12 np0005603500 nova_compute[182934]: 2026-01-31 06:31:12.301 182938 DEBUG nova.compute.manager [req-256c67d9-093c-4860-a7a7-2bf03567cea5 req-c31b0e6d-ee1a-446f-a88c-367512cc746f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received event network-changed-57786dbf-8d90-4cbe-834d-3cd072a75d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:31:12 np0005603500 nova_compute[182934]: 2026-01-31 06:31:12.302 182938 DEBUG nova.compute.manager [req-256c67d9-093c-4860-a7a7-2bf03567cea5 req-c31b0e6d-ee1a-446f-a88c-367512cc746f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Refreshing instance network info cache due to event network-changed-57786dbf-8d90-4cbe-834d-3cd072a75d1f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:31:12 np0005603500 nova_compute[182934]: 2026-01-31 06:31:12.302 182938 DEBUG oslo_concurrency.lockutils [req-256c67d9-093c-4860-a7a7-2bf03567cea5 req-c31b0e6d-ee1a-446f-a88c-367512cc746f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:31:13 np0005603500 nova_compute[182934]: 2026-01-31 06:31:13.553 182938 DEBUG nova.network.neutron [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:31:15 np0005603500 podman[211672]: 2026-01-31 06:31:15.145359834 +0000 UTC m=+0.061158134 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.046 182938 DEBUG nova.network.neutron [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Updating instance_info_cache with network_info: [{"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.567 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.568 182938 DEBUG nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Instance network_info: |[{"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.568 182938 DEBUG oslo_concurrency.lockutils [req-256c67d9-093c-4860-a7a7-2bf03567cea5 req-c31b0e6d-ee1a-446f-a88c-367512cc746f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.568 182938 DEBUG nova.network.neutron [req-256c67d9-093c-4860-a7a7-2bf03567cea5 req-c31b0e6d-ee1a-446f-a88c-367512cc746f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Refreshing network info cache for port 57786dbf-8d90-4cbe-834d-3cd072a75d1f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.572 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Start _get_guest_xml network_info=[{"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.578 182938 WARNING nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.586 182938 DEBUG nova.virt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-132386650', uuid='7dedc0e6-e769-4fda-b465-152126c73743'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device
_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841077.586156) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.598 182938 DEBUG nova.virt.libvirt.host [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.598 182938 DEBUG nova.virt.libvirt.host [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.603 182938 DEBUG nova.virt.libvirt.host [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.603 182938 DEBUG nova.virt.libvirt.host [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.604 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.604 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.605 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.605 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.605 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.605 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.605 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.606 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.606 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.606 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.607 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.607 182938 DEBUG nova.virt.hardware [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.645 182938 DEBUG nova.privsep.utils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.647 182938 DEBUG nova.virt.libvirt.vif [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:30:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-132386650',display_name='tempest-TestNetworkBasicOps-server-132386650',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-132386650',id=1,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGw41nQMJfT2kxXNRqmgCqTA3vngkKt8B3ulIHkgKcd42+FYSdC0j1jZchA3NNtcC9su1Z4mbyf3ZR6prbQi5Gh07jOCnjQDe+eIAPeL02ydcm3jjG1oX1Ppzv7y0nED0g==',key_name='tempest-TestNetworkBasicOps-78724422',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-r0f9rvd2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:31:01Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=7dedc0e6-e769-4fda-b465-152126c73743,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.647 182938 DEBUG nova.network.os_vif_util [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.648 182938 DEBUG nova.network.os_vif_util [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:ad:4e,bridge_name='br-int',has_traffic_filtering=True,id=57786dbf-8d90-4cbe-834d-3cd072a75d1f,network=Network(349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57786dbf-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:31:17 np0005603500 nova_compute[182934]: 2026-01-31 06:31:17.650 182938 DEBUG nova.objects.instance [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7dedc0e6-e769-4fda-b465-152126c73743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.269 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <uuid>7dedc0e6-e769-4fda-b465-152126c73743</uuid>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <name>instance-00000001</name>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-132386650</nova:name>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:31:17</nova:creationTime>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:        <nova:port uuid="57786dbf-8d90-4cbe-834d-3cd072a75d1f">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <entry name="serial">7dedc0e6-e769-4fda-b465-152126c73743</entry>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <entry name="uuid">7dedc0e6-e769-4fda-b465-152126c73743</entry>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk.config"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:dc:ad:4e"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <target dev="tap57786dbf-8d"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/console.log" append="off"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:31:18 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:31:18 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:31:18 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:31:18 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.270 182938 DEBUG nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Preparing to wait for external event network-vif-plugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.270 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "7dedc0e6-e769-4fda-b465-152126c73743-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.270 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.270 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.271 182938 DEBUG nova.virt.libvirt.vif [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:30:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-132386650',display_name='tempest-TestNetworkBasicOps-server-132386650',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-132386650',id=1,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGw41nQMJfT2kxXNRqmgCqTA3vngkKt8B3ulIHkgKcd42+FYSdC0j1jZchA3NNtcC9su1Z4mbyf3ZR6prbQi5Gh07jOCnjQDe+eIAPeL02ydcm3jjG1oX1Ppzv7y0nED0g==',key_name='tempest-TestNetworkBasicOps-78724422',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-r0f9rvd2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:31:01Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=7dedc0e6-e769-4fda-b465-152126c73743,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.271 182938 DEBUG nova.network.os_vif_util [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.272 182938 DEBUG nova.network.os_vif_util [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:ad:4e,bridge_name='br-int',has_traffic_filtering=True,id=57786dbf-8d90-4cbe-834d-3cd072a75d1f,network=Network(349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57786dbf-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.272 182938 DEBUG os_vif [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:ad:4e,bridge_name='br-int',has_traffic_filtering=True,id=57786dbf-8d90-4cbe-834d-3cd072a75d1f,network=Network(349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57786dbf-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.317 182938 DEBUG ovsdbapp.backend.ovs_idl [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.317 182938 DEBUG ovsdbapp.backend.ovs_idl [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.317 182938 DEBUG ovsdbapp.backend.ovs_idl [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.318 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.319 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.319 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.319 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.320 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.322 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.329 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.330 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.330 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.331 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.331 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6ef36b2b-75ce-5606-8ef7-7486c7390062', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.332 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.333 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:31:18 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.335 182938 INFO oslo.privsep.daemon [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpc6jebvae/privsep.sock']
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:19.008 182938 INFO oslo.privsep.daemon [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Spawned new privsep daemon via rootwrap
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.831 211698 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.835 211698 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.838 211698 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:18.838 211698 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211698
Jan 31 01:31:19 np0005603500 podman[211700]: 2026-01-31 06:31:19.14375795 +0000 UTC m=+0.066423051 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:19.297 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:19.297 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57786dbf-8d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:19.298 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap57786dbf-8d, col_values=(('qos', UUID('4006599b-ba02-4e30-8a89-0d7fe1197cf3')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:19.299 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap57786dbf-8d, col_values=(('external_ids', {'iface-id': '57786dbf-8d90-4cbe-834d-3cd072a75d1f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dc:ad:4e', 'vm-uuid': '7dedc0e6-e769-4fda-b465-152126c73743'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:31:19 np0005603500 NetworkManager[55506]: <info>  [1769841079.3017] manager: (tap57786dbf-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:19.304 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:19.307 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:19 np0005603500 nova_compute[182934]: 2026-01-31 06:31:19.308 182938 INFO os_vif [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:ad:4e,bridge_name='br-int',has_traffic_filtering=True,id=57786dbf-8d90-4cbe-834d-3cd072a75d1f,network=Network(349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57786dbf-8d')
Jan 31 01:31:21 np0005603500 nova_compute[182934]: 2026-01-31 06:31:21.034 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:31:21 np0005603500 nova_compute[182934]: 2026-01-31 06:31:21.035 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:31:21 np0005603500 nova_compute[182934]: 2026-01-31 06:31:21.035 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:dc:ad:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:31:21 np0005603500 nova_compute[182934]: 2026-01-31 06:31:21.036 182938 INFO nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Using config drive
Jan 31 01:31:22 np0005603500 nova_compute[182934]: 2026-01-31 06:31:22.720 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:22 np0005603500 nova_compute[182934]: 2026-01-31 06:31:22.879 182938 DEBUG nova.network.neutron [req-256c67d9-093c-4860-a7a7-2bf03567cea5 req-c31b0e6d-ee1a-446f-a88c-367512cc746f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Updated VIF entry in instance network info cache for port 57786dbf-8d90-4cbe-834d-3cd072a75d1f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:31:22 np0005603500 nova_compute[182934]: 2026-01-31 06:31:22.879 182938 DEBUG nova.network.neutron [req-256c67d9-093c-4860-a7a7-2bf03567cea5 req-c31b0e6d-ee1a-446f-a88c-367512cc746f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Updating instance_info_cache with network_info: [{"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:31:23 np0005603500 nova_compute[182934]: 2026-01-31 06:31:23.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:31:23 np0005603500 nova_compute[182934]: 2026-01-31 06:31:23.503 182938 DEBUG oslo_concurrency.lockutils [req-256c67d9-093c-4860-a7a7-2bf03567cea5 req-c31b0e6d-ee1a-446f-a88c-367512cc746f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.134 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.134 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.135 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.151 182938 INFO nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Creating config drive at /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk.config
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.158 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpy7szyuhp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.178 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:31:24 np0005603500 podman[211733]: 2026-01-31 06:31:24.204870142 +0000 UTC m=+0.111772085 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.284 182938 DEBUG oslo_concurrency.processutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpy7szyuhp" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.301 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:24 np0005603500 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 31 01:31:24 np0005603500 kernel: tap57786dbf-8d: entered promiscuous mode
Jan 31 01:31:24 np0005603500 NetworkManager[55506]: <info>  [1769841084.3477] manager: (tap57786dbf-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Jan 31 01:31:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:31:24Z|00040|binding|INFO|Claiming lport 57786dbf-8d90-4cbe-834d-3cd072a75d1f for this chassis.
Jan 31 01:31:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:31:24Z|00041|binding|INFO|57786dbf-8d90-4cbe-834d-3cd072a75d1f: Claiming fa:16:3e:dc:ad:4e 10.100.0.14
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.349 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.351 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:24 np0005603500 systemd-udevd[211776]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:31:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:31:24Z|00042|binding|INFO|Setting lport 57786dbf-8d90-4cbe-834d-3cd072a75d1f ovn-installed in OVS
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.380 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.382 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:24 np0005603500 NetworkManager[55506]: <info>  [1769841084.3881] device (tap57786dbf-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:31:24 np0005603500 NetworkManager[55506]: <info>  [1769841084.3890] device (tap57786dbf-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:31:24 np0005603500 systemd-machined[154375]: New machine qemu-1-instance-00000001.
Jan 31 01:31:24 np0005603500 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.436 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:ad:4e 10.100.0.14'], port_security=['fa:16:3e:dc:ad:4e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dedc0e6-e769-4fda-b465-152126c73743', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b742a00d-33bc-4f25-9899-8560eae25dc3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06664cb8-3ec3-4b12-9420-23c1bc38e360, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=57786dbf-8d90-4cbe-834d-3cd072a75d1f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.437 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 57786dbf-8d90-4cbe-834d-3cd072a75d1f in datapath 349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63 bound to our chassis
Jan 31 01:31:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:31:24Z|00043|binding|INFO|Setting lport 57786dbf-8d90-4cbe-834d-3cd072a75d1f up in Southbound
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.439 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.461 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[fde0ac48-02ee-4cf3-a0e2-67b7550744fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.462 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap349f85d7-91 in ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.465 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap349f85d7-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.465 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[415eb761-6def-45a6-9014-df2e71b8c1be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.466 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ac7b5562-4a3b-4c46-a576-b6f53977df70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.482 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[c88a3c92-0f87-41eb-9cfb-5094055ce361]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.497 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a2af26c4-7acc-4edf-a9dd-44496e4ba81d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:24.500 104644 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmphnzhpuni/privsep.sock']
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.777 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.779 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.779 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:24 np0005603500 nova_compute[182934]: 2026-01-31 06:31:24.779 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.234 104644 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.235 104644 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmphnzhpuni/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:366
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.106 211809 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.109 211809 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.111 211809 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.111 211809 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211809
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.237 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[89edf6e1-4968-48d9-b14b-1d081d6f3dee]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.738 211809 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.738 211809 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:25.739 211809 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.002 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.047 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.049 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.126 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:31:26 np0005603500 podman[211820]: 2026-01-31 06:31:26.160896174 +0000 UTC m=+0.077668682 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.238 211809 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.243 211809 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.259 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.261 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5812MB free_disk=73.21533584594727GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.261 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.262 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.308 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[f07de9f2-c4f7-4cc0-b9a4-a7b0457faf95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 systemd-udevd[211775]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.330 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[fef71ee7-64ba-4ef2-8286-dd6f6d52f53c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 NetworkManager[55506]: <info>  [1769841086.3315] manager: (tap349f85d7-90): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.361 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[9e009d79-61c7-4733-b91e-c691637e029e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.364 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[e0175848-372a-4bea-823d-44d6ac5ee46f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 NetworkManager[55506]: <info>  [1769841086.3846] device (tap349f85d7-90): carrier: link connected
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.388 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[05f7ec3d-81b7-4589-ab31-5f400280130a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.405 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8c344478-d8a0-4bd7-86d7-5c55418ae1ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap349f85d7-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:96:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339768, 'reachable_time': 20380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211865, 'error': None, 'target': 'ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.421 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f361224a-db94-4e0c-a03e-6808b5070464]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feca:9678'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339768, 'tstamp': 339768}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211866, 'error': None, 'target': 'ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.446 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[dba532f0-a970-489d-b898-f67329b357c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap349f85d7-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:96:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339768, 'reachable_time': 20380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211867, 'error': None, 'target': 'ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.479 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[542a147d-ee21-42bf-91ca-0bec9f8e2e75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.537 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[54f6a338-5a69-4986-b52b-9f31db75c995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.540 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap349f85d7-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.541 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.541 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap349f85d7-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.544 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:26 np0005603500 NetworkManager[55506]: <info>  [1769841086.5449] manager: (tap349f85d7-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 31 01:31:26 np0005603500 kernel: tap349f85d7-90: entered promiscuous mode
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.547 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.549 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap349f85d7-90, col_values=(('external_ids', {'iface-id': 'bbc7e9d4-0e42-4731-a0a1-d6912b3f33d9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.550 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:26 np0005603500 ovn_controller[95398]: 2026-01-31T06:31:26Z|00044|binding|INFO|Releasing lport bbc7e9d4-0e42-4731-a0a1-d6912b3f33d9 from this chassis (sb_readonly=0)
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.550 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:26 np0005603500 nova_compute[182934]: 2026-01-31 06:31:26.554 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.553 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a74c8c07-f9a7-471b-bfe3-045a2c2aa0e6]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.555 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.556 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.556 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.557 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.557 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8edf3311-d369-452a-bbdc-202ff8d7b4af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.558 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.559 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[846a0130-0d45-4413-a55f-f8e81d8ca35f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.559 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID 349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:31:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:26.561 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'env', 'PROCESS_TAG=haproxy-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.020 182938 DEBUG nova.compute.manager [req-3c98dbd5-0118-460b-b759-dd5621839ed0 req-8d32b680-ddf0-40a3-8766-7654ee25bc53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received event network-vif-plugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.021 182938 DEBUG oslo_concurrency.lockutils [req-3c98dbd5-0118-460b-b759-dd5621839ed0 req-8d32b680-ddf0-40a3-8766-7654ee25bc53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "7dedc0e6-e769-4fda-b465-152126c73743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.021 182938 DEBUG oslo_concurrency.lockutils [req-3c98dbd5-0118-460b-b759-dd5621839ed0 req-8d32b680-ddf0-40a3-8766-7654ee25bc53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.021 182938 DEBUG oslo_concurrency.lockutils [req-3c98dbd5-0118-460b-b759-dd5621839ed0 req-8d32b680-ddf0-40a3-8766-7654ee25bc53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.022 182938 DEBUG nova.compute.manager [req-3c98dbd5-0118-460b-b759-dd5621839ed0 req-8d32b680-ddf0-40a3-8766-7654ee25bc53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Processing event network-vif-plugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.022 182938 DEBUG nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.034 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.037 182938 INFO nova.virt.libvirt.driver [-] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Instance spawned successfully.
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.038 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:31:27 np0005603500 podman[211900]: 2026-01-31 06:31:27.006550676 +0000 UTC m=+0.039997544 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:31:27 np0005603500 podman[211900]: 2026-01-31 06:31:27.175063011 +0000 UTC m=+0.208509869 container create fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Jan 31 01:31:27 np0005603500 systemd[1]: Started libpod-conmon-fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7.scope.
Jan 31 01:31:27 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:31:27 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7e3c36394e0facde9cd6b4e232224ee7933535e67e7034a5664ad4d2127baea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:31:27 np0005603500 podman[211900]: 2026-01-31 06:31:27.41980595 +0000 UTC m=+0.453252828 container init fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:31:27 np0005603500 podman[211900]: 2026-01-31 06:31:27.435262966 +0000 UTC m=+0.468709844 container start fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:31:27 np0005603500 neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63[211915]: [NOTICE]   (211919) : New worker (211921) forked
Jan 31 01:31:27 np0005603500 neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63[211915]: [NOTICE]   (211919) : Loading success.
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.533 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 7dedc0e6-e769-4fda-b465-152126c73743 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.534 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.534 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.584 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.629 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.630 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.630 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.631 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.631 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.632 182938 DEBUG nova.virt.libvirt.driver [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:31:27 np0005603500 nova_compute[182934]: 2026-01-31 06:31:27.722 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:28 np0005603500 nova_compute[182934]: 2026-01-31 06:31:28.157 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updated inventory for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:975
Jan 31 01:31:28 np0005603500 nova_compute[182934]: 2026-01-31 06:31:28.157 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 31 01:31:28 np0005603500 nova_compute[182934]: 2026-01-31 06:31:28.158 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:31:28 np0005603500 nova_compute[182934]: 2026-01-31 06:31:28.355 182938 INFO nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Took 25.82 seconds to spawn the instance on the hypervisor.
Jan 31 01:31:28 np0005603500 nova_compute[182934]: 2026-01-31 06:31:28.356 182938 DEBUG nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:31:28 np0005603500 nova_compute[182934]: 2026-01-31 06:31:28.708 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:31:28 np0005603500 nova_compute[182934]: 2026-01-31 06:31:28.709 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.212 182938 INFO nova.compute.manager [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Took 32.52 seconds to build instance.
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.303 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.421 182938 DEBUG nova.compute.manager [req-097f1f00-b543-4d27-926a-e96f4987489c req-1260c340-7879-4c61-a64b-16f599328b61 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received event network-vif-plugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.421 182938 DEBUG oslo_concurrency.lockutils [req-097f1f00-b543-4d27-926a-e96f4987489c req-1260c340-7879-4c61-a64b-16f599328b61 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "7dedc0e6-e769-4fda-b465-152126c73743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.422 182938 DEBUG oslo_concurrency.lockutils [req-097f1f00-b543-4d27-926a-e96f4987489c req-1260c340-7879-4c61-a64b-16f599328b61 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.422 182938 DEBUG oslo_concurrency.lockutils [req-097f1f00-b543-4d27-926a-e96f4987489c req-1260c340-7879-4c61-a64b-16f599328b61 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.422 182938 DEBUG nova.compute.manager [req-097f1f00-b543-4d27-926a-e96f4987489c req-1260c340-7879-4c61-a64b-16f599328b61 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] No waiting events found dispatching network-vif-plugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.422 182938 WARNING nova.compute.manager [req-097f1f00-b543-4d27-926a-e96f4987489c req-1260c340-7879-4c61-a64b-16f599328b61 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received unexpected event network-vif-plugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f for instance with vm_state active and task_state None.
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.678 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.679 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:31:29 np0005603500 nova_compute[182934]: 2026-01-31 06:31:29.757 182938 DEBUG oslo_concurrency.lockutils [None req-d0c8c1c7-aceb-4bfb-8a32-2daaed0d7616 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 34.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:30 np0005603500 nova_compute[182934]: 2026-01-31 06:31:30.291 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:31:30 np0005603500 nova_compute[182934]: 2026-01-31 06:31:30.292 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:31:30 np0005603500 nova_compute[182934]: 2026-01-31 06:31:30.292 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:31:30 np0005603500 nova_compute[182934]: 2026-01-31 06:31:30.293 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:31:30 np0005603500 nova_compute[182934]: 2026-01-31 06:31:30.293 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:31:30 np0005603500 nova_compute[182934]: 2026-01-31 06:31:30.293 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:31:32 np0005603500 nova_compute[182934]: 2026-01-31 06:31:32.723 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:33.137 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:31:34 np0005603500 nova_compute[182934]: 2026-01-31 06:31:34.305 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <info>  [1769841097.6313] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Jan 31 01:31:37 np0005603500 ovn_controller[95398]: 2026-01-31T06:31:37Z|00045|binding|INFO|Releasing lport bbc7e9d4-0e42-4731-a0a1-d6912b3f33d9 from this chassis (sb_readonly=0)
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <info>  [1769841097.6361] device (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <warn>  [1769841097.6364] device (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <info>  [1769841097.6385] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <info>  [1769841097.6393] device (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <warn>  [1769841097.6394] device (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <info>  [1769841097.6407] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Jan 31 01:31:37 np0005603500 nova_compute[182934]: 2026-01-31 06:31:37.638 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <info>  [1769841097.6417] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <info>  [1769841097.6424] device (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 31 01:31:37 np0005603500 NetworkManager[55506]: <info>  [1769841097.6433] device (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 31 01:31:37 np0005603500 ovn_controller[95398]: 2026-01-31T06:31:37Z|00046|binding|INFO|Releasing lport bbc7e9d4-0e42-4731-a0a1-d6912b3f33d9 from this chassis (sb_readonly=0)
Jan 31 01:31:37 np0005603500 nova_compute[182934]: 2026-01-31 06:31:37.661 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:37 np0005603500 nova_compute[182934]: 2026-01-31 06:31:37.725 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:38 np0005603500 podman[211932]: 2026-01-31 06:31:38.162662855 +0000 UTC m=+0.071708170 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 31 01:31:38 np0005603500 nova_compute[182934]: 2026-01-31 06:31:38.630 182938 DEBUG nova.compute.manager [req-888be602-8e9a-476d-a122-462684913d07 req-7878acfd-a6d6-4194-843c-59281c38c140 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received event network-changed-57786dbf-8d90-4cbe-834d-3cd072a75d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:31:38 np0005603500 nova_compute[182934]: 2026-01-31 06:31:38.631 182938 DEBUG nova.compute.manager [req-888be602-8e9a-476d-a122-462684913d07 req-7878acfd-a6d6-4194-843c-59281c38c140 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Refreshing instance network info cache due to event network-changed-57786dbf-8d90-4cbe-834d-3cd072a75d1f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:31:38 np0005603500 nova_compute[182934]: 2026-01-31 06:31:38.631 182938 DEBUG oslo_concurrency.lockutils [req-888be602-8e9a-476d-a122-462684913d07 req-7878acfd-a6d6-4194-843c-59281c38c140 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:31:38 np0005603500 nova_compute[182934]: 2026-01-31 06:31:38.631 182938 DEBUG oslo_concurrency.lockutils [req-888be602-8e9a-476d-a122-462684913d07 req-7878acfd-a6d6-4194-843c-59281c38c140 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:31:38 np0005603500 nova_compute[182934]: 2026-01-31 06:31:38.631 182938 DEBUG nova.network.neutron [req-888be602-8e9a-476d-a122-462684913d07 req-7878acfd-a6d6-4194-843c-59281c38c140 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Refreshing network info cache for port 57786dbf-8d90-4cbe-834d-3cd072a75d1f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:31:39 np0005603500 nova_compute[182934]: 2026-01-31 06:31:39.308 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:40 np0005603500 podman[211958]: 2026-01-31 06:31:40.138408621 +0000 UTC m=+0.056960587 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:31:42 np0005603500 nova_compute[182934]: 2026-01-31 06:31:42.727 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:43 np0005603500 nova_compute[182934]: 2026-01-31 06:31:43.966 182938 DEBUG nova.network.neutron [req-888be602-8e9a-476d-a122-462684913d07 req-7878acfd-a6d6-4194-843c-59281c38c140 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Updated VIF entry in instance network info cache for port 57786dbf-8d90-4cbe-834d-3cd072a75d1f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:31:43 np0005603500 nova_compute[182934]: 2026-01-31 06:31:43.967 182938 DEBUG nova.network.neutron [req-888be602-8e9a-476d-a122-462684913d07 req-7878acfd-a6d6-4194-843c-59281c38c140 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Updating instance_info_cache with network_info: [{"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:31:44 np0005603500 nova_compute[182934]: 2026-01-31 06:31:44.309 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:31:44Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dc:ad:4e 10.100.0.14
Jan 31 01:31:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:31:44Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dc:ad:4e 10.100.0.14
Jan 31 01:31:44 np0005603500 nova_compute[182934]: 2026-01-31 06:31:44.608 182938 DEBUG oslo_concurrency.lockutils [req-888be602-8e9a-476d-a122-462684913d07 req-7878acfd-a6d6-4194-843c-59281c38c140 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:31:46 np0005603500 podman[211994]: 2026-01-31 06:31:46.147674201 +0000 UTC m=+0.055837522 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, 
release=1769056855, version=9.7, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible)
Jan 31 01:31:47 np0005603500 nova_compute[182934]: 2026-01-31 06:31:47.730 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:49 np0005603500 nova_compute[182934]: 2026-01-31 06:31:49.310 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:50 np0005603500 podman[212016]: 2026-01-31 06:31:50.194458188 +0000 UTC m=+0.110211145 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 31 01:31:51 np0005603500 nova_compute[182934]: 2026-01-31 06:31:51.351 182938 INFO nova.compute.manager [None req-b5051840-8f19-4dc5-9294-4dd720727a67 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Get console output
Jan 31 01:31:51 np0005603500 nova_compute[182934]: 2026-01-31 06:31:51.473 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:31:52 np0005603500 nova_compute[182934]: 2026-01-31 06:31:52.731 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:54 np0005603500 nova_compute[182934]: 2026-01-31 06:31:54.311 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:55 np0005603500 podman[212043]: 2026-01-31 06:31:55.171344247 +0000 UTC m=+0.083582502 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 01:31:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:55.539 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:31:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:55.539 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:31:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:31:55.540 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:31:57 np0005603500 podman[212066]: 2026-01-31 06:31:57.130366678 +0000 UTC m=+0.047625589 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:31:57 np0005603500 nova_compute[182934]: 2026-01-31 06:31:57.768 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:31:59 np0005603500 nova_compute[182934]: 2026-01-31 06:31:59.313 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:01 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:01.103 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:8c:df 10.100.0.18'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ee2d8dd-2830-4df2-a30b-d14056986193', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ea3e50b-96a6-4b10-8620-0c365db289f7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=819f4165-cac3-47d8-a691-9a70a2b6604f) old=Port_Binding(mac=['fa:16:3e:83:8c:df'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ee2d8dd-2830-4df2-a30b-d14056986193', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:32:01 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:01.105 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 819f4165-cac3-47d8-a691-9a70a2b6604f in datapath 8ee2d8dd-2830-4df2-a30b-d14056986193 updated
Jan 31 01:32:01 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:01.106 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8ee2d8dd-2830-4df2-a30b-d14056986193, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:32:01 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:01.107 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f51431a1-c411-49a5-a118-cdbf177b4709]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:02 np0005603500 nova_compute[182934]: 2026-01-31 06:32:02.768 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:04 np0005603500 nova_compute[182934]: 2026-01-31 06:32:04.314 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:07 np0005603500 nova_compute[182934]: 2026-01-31 06:32:07.770 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:09 np0005603500 podman[212090]: 2026-01-31 06:32:09.166351384 +0000 UTC m=+0.087635736 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 31 01:32:09 np0005603500 nova_compute[182934]: 2026-01-31 06:32:09.316 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:11 np0005603500 podman[212108]: 2026-01-31 06:32:11.126647363 +0000 UTC m=+0.047830212 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 31 01:32:12 np0005603500 nova_compute[182934]: 2026-01-31 06:32:12.771 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:14 np0005603500 nova_compute[182934]: 2026-01-31 06:32:14.318 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:17 np0005603500 podman[212132]: 2026-01-31 06:32:17.118348597 +0000 UTC m=+0.042441800 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.7, architecture=x86_64, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, 
io.buildah.version=1.33.7, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Jan 31 01:32:17 np0005603500 nova_compute[182934]: 2026-01-31 06:32:17.774 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:17.981 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:18 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:18.460 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/7dedc0e6-e769-4fda-b465-152126c73743 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}9de33c3c4c813c7413c734743528a34030291a616c281269e5092e293b0fad44" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Jan 31 01:32:19 np0005603500 nova_compute[182934]: 2026-01-31 06:32:19.320 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.580 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1974 Content-Type: application/json Date: Sat, 31 Jan 2026 06:32:18 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-857ab093-9c89-4fd2-97f9-da89504e8413 x-openstack-request-id: req-857ab093-9c89-4fd2-97f9-da89504e8413 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.581 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "7dedc0e6-e769-4fda-b465-152126c73743", "name": "tempest-TestNetworkBasicOps-server-132386650", "status": "ACTIVE", "tenant_id": "829310cd8381494e96216dba067ff8d3", "user_id": "dddc34b0385a49a5bd9bf081ed29e9fd", "metadata": {}, "hostId": "0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e", "image": {"id": "9f613975-b701-42a0-9b35-7d5c4a2cb7f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/9f613975-b701-42a0-9b35-7d5c4a2cb7f2"}]}, "flavor": {"id": "9956992e-a3ca-497f-9747-3ae270e07def", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9956992e-a3ca-497f-9747-3ae270e07def"}]}, "created": "2026-01-31T06:30:50Z", "updated": "2026-01-31T06:31:28Z", "addresses": {"tempest-network-smoke--1435648712": [{"version": 4, "addr": "10.100.0.14", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:dc:ad:4e"}, {"version": 4, "addr": "192.168.122.210", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:dc:ad:4e"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/7dedc0e6-e769-4fda-b465-152126c73743"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/7dedc0e6-e769-4fda-b465-152126c73743"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-78724422", "OS-SRV-USG:launched_at": "2026-01-31T06:31:28.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-1717074477"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000001", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, 
"OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.581 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/7dedc0e6-e769-4fda-b465-152126c73743 used request id req-857ab093-9c89-4fd2-97f9-da89504e8413 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.583 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7dedc0e6-e769-4fda-b465-152126c73743', 'name': 'tempest-TestNetworkBasicOps-server-132386650', 'flavor': {'id': '9956992e-a3ca-497f-9747-3ae270e07def', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '829310cd8381494e96216dba067ff8d3', 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'hostId': '0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.583 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.583 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.584 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.584 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.586 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2026-01-31T06:32:20.584348) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.586 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.586 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.587 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.587 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.587 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.587 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.587 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.588 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2026-01-31T06:32:20.587590) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.588 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-132386650>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-132386650>]
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.588 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.589 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.589 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.589 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.589 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.589 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2026-01-31T06:32:20.589278) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.619 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.write.requests volume: 322 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.619 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.620 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.620 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.620 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.621 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.621 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.621 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.622 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2026-01-31T06:32:20.621305) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.624 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7dedc0e6-e769-4fda-b465-152126c73743 / tap57786dbf-8d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.625 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.incoming.bytes volume: 19674 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.625 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.626 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.626 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.626 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.626 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.626 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.626 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.627 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-132386650>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-132386650>]
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.627 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.627 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.627 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2026-01-31T06:32:20.626596) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.627 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.627 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.627 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.628 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2026-01-31T06:32:20.627855) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.641 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.642 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.642 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.642 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.642 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.643 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.643 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2026-01-31T06:32:20.643197) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.643 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.644 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.644 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.644 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.645 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.645 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.645 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2026-01-31T06:32:20.645469) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.645 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.646 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.read.bytes volume: 30824960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.646 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.647 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.647 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.647 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.648 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.648 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.648 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2026-01-31T06:32:20.648445) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.648 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.649 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.read.latency volume: 1999570256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.649 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.read.latency volume: 168750058 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.650 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.650 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.650 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.651 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.651 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.651 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2026-01-31T06:32:20.651442) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.651 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.652 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.652 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.652 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.653 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.653 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.653 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.654 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2026-01-31T06:32:20.653849) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.653 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.654 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.outgoing.bytes volume: 16166 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.655 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.655 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.655 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.655 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.656 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.656 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.656 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2026-01-31T06:32:20.656182) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.656 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.657 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.657 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.658 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.658 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.658 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.658 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2026-01-31T06:32:20.658622) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.658 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.659 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.read.requests volume: 1102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.659 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.660 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.660 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.660 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.661 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.661 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.661 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2026-01-31T06:32:20.661451) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.661 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.662 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.write.latency volume: 86437404844 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.662 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.663 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.663 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.663 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.664 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.664 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.664 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2026-01-31T06:32:20.664545) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.664 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.665 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.665 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.666 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.666 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.666 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.666 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.667 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2026-01-31T06:32:20.666932) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.667 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.667 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.outgoing.packets volume: 111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.668 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.668 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.668 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.668 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.669 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.669 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.669 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2026-01-31T06:32:20.669245) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.682 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.683 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.683 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.684 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.684 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.684 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.684 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.685 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.685 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2026-01-31T06:32:20.685034) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.685 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.686 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.686 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.686 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.686 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.686 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.686 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.687 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.write.bytes volume: 72982528 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.686 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2026-01-31T06:32:20.686736) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.687 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.687 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.687 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.688 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.688 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.688 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.688 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.688 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.incoming.packets volume: 102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.688 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2026-01-31T06:32:20.688444) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.689 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.689 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.689 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.689 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.689 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.689 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.690 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.690 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.690 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.691 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.691 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.691 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.691 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.691 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.691 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/cpu volume: 11690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.692 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.692 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.692 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.692 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.692 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.692 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.692 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/memory.usage volume: 41.98046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.693 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.693 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.693 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.693 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.694 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.694 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.694 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.694 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.694 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.694 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.694 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.695 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.695 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.695 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.695 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.695 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.695 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.695 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.695 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.696 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.696 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.696 16 DEBUG ceilometer.compute.pollsters [-] 7dedc0e6-e769-4fda-b465-152126c73743/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.696 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.697 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2026-01-31T06:32:20.689898) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.697 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2026-01-31T06:32:20.691515) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.697 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2026-01-31T06:32:20.692815) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.698 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2026-01-31T06:32:20.694102) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.698 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2026-01-31T06:32:20.695120) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:20 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:32:20.698 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2026-01-31T06:32:20.696070) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:32:21 np0005603500 podman[212155]: 2026-01-31 06:32:21.15397993 +0000 UTC m=+0.073736815 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 01:32:21 np0005603500 nova_compute[182934]: 2026-01-31 06:32:21.755 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:21 np0005603500 nova_compute[182934]: 2026-01-31 06:32:21.755 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:22 np0005603500 nova_compute[182934]: 2026-01-31 06:32:22.390 182938 DEBUG nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:32:22 np0005603500 nova_compute[182934]: 2026-01-31 06:32:22.778 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:22 np0005603500 nova_compute[182934]: 2026-01-31 06:32:22.967 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:22 np0005603500 nova_compute[182934]: 2026-01-31 06:32:22.968 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:22 np0005603500 nova_compute[182934]: 2026-01-31 06:32:22.976 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:32:22 np0005603500 nova_compute[182934]: 2026-01-31 06:32:22.976 182938 INFO nova.compute.claims [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:32:24 np0005603500 nova_compute[182934]: 2026-01-31 06:32:24.095 182938 DEBUG nova.compute.provider_tree [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:32:24 np0005603500 nova_compute[182934]: 2026-01-31 06:32:24.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:32:24 np0005603500 nova_compute[182934]: 2026-01-31 06:32:24.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:32:24 np0005603500 nova_compute[182934]: 2026-01-31 06:32:24.322 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:24 np0005603500 nova_compute[182934]: 2026-01-31 06:32:24.697 182938 DEBUG nova.scheduler.client.report [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:32:24 np0005603500 nova_compute[182934]: 2026-01-31 06:32:24.771 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:25 np0005603500 nova_compute[182934]: 2026-01-31 06:32:25.241 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:25 np0005603500 nova_compute[182934]: 2026-01-31 06:32:25.242 182938 DEBUG nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:32:25 np0005603500 nova_compute[182934]: 2026-01-31 06:32:25.244 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:25 np0005603500 nova_compute[182934]: 2026-01-31 06:32:25.245 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:25 np0005603500 nova_compute[182934]: 2026-01-31 06:32:25.245 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:32:25 np0005603500 nova_compute[182934]: 2026-01-31 06:32:25.814 182938 DEBUG nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:32:25 np0005603500 nova_compute[182934]: 2026-01-31 06:32:25.815 182938 DEBUG nova.network.neutron [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:32:26 np0005603500 podman[212181]: 2026-01-31 06:32:26.125592911 +0000 UTC m=+0.047927394 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127)
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.261 182938 DEBUG nova.policy [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.500 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.552 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.553 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.603 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:32:26 np0005603500 ovn_controller[95398]: 2026-01-31T06:32:26Z|00047|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.714 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.715 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5630MB free_disk=73.18732070922852GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.716 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.716 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:26 np0005603500 nova_compute[182934]: 2026-01-31 06:32:26.907 182938 INFO nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:32:27 np0005603500 nova_compute[182934]: 2026-01-31 06:32:27.422 182938 DEBUG nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:32:27 np0005603500 nova_compute[182934]: 2026-01-31 06:32:27.778 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:27 np0005603500 nova_compute[182934]: 2026-01-31 06:32:27.817 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 7dedc0e6-e769-4fda-b465-152126c73743 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:32:27 np0005603500 nova_compute[182934]: 2026-01-31 06:32:27.817 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:32:27 np0005603500 nova_compute[182934]: 2026-01-31 06:32:27.817 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:32:27 np0005603500 nova_compute[182934]: 2026-01-31 06:32:27.818 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:32:27 np0005603500 nova_compute[182934]: 2026-01-31 06:32:27.883 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:32:28 np0005603500 podman[212208]: 2026-01-31 06:32:28.118595379 +0000 UTC m=+0.039177996 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:32:28 np0005603500 nova_compute[182934]: 2026-01-31 06:32:28.483 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.045 182938 DEBUG nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.047 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.048 182938 INFO nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Creating image(s)
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.049 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.050 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.051 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.052 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.060 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.062 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.105 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.106 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.107 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.108 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.111 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.112 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.147 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.148 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.165 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.166 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.202 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.203 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.203 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.245 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.245 182938 DEBUG nova.virt.disk.api [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.246 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.286 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.287 182938 DEBUG nova.virt.disk.api [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.287 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.287 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Ensure instance console log exists: /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.288 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.288 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.288 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.324 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:29.934 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:32:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:29.935 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:32:29 np0005603500 nova_compute[182934]: 2026-01-31 06:32:29.935 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:30 np0005603500 nova_compute[182934]: 2026-01-31 06:32:30.121 182938 DEBUG nova.network.neutron [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Successfully created port: 09341075-d930-414d-afa9-4ab4a986923d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:32:30 np0005603500 nova_compute[182934]: 2026-01-31 06:32:30.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:32:30 np0005603500 nova_compute[182934]: 2026-01-31 06:32:30.149 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:32:30 np0005603500 nova_compute[182934]: 2026-01-31 06:32:30.149 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:32:30 np0005603500 nova_compute[182934]: 2026-01-31 06:32:30.149 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:32:30 np0005603500 nova_compute[182934]: 2026-01-31 06:32:30.149 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:32:30 np0005603500 nova_compute[182934]: 2026-01-31 06:32:30.150 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:32:31 np0005603500 nova_compute[182934]: 2026-01-31 06:32:31.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:32:31 np0005603500 nova_compute[182934]: 2026-01-31 06:32:31.257 182938 DEBUG nova.network.neutron [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Successfully updated port: 09341075-d930-414d-afa9-4ab4a986923d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:32:31 np0005603500 nova_compute[182934]: 2026-01-31 06:32:31.553 182938 DEBUG nova.compute.manager [req-34e38383-8c82-4143-8be7-f57f9a080255 req-123dcfa5-7a80-43a5-a796-ed0c098d11f3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Received event network-changed-09341075-d930-414d-afa9-4ab4a986923d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:32:31 np0005603500 nova_compute[182934]: 2026-01-31 06:32:31.554 182938 DEBUG nova.compute.manager [req-34e38383-8c82-4143-8be7-f57f9a080255 req-123dcfa5-7a80-43a5-a796-ed0c098d11f3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Refreshing instance network info cache due to event network-changed-09341075-d930-414d-afa9-4ab4a986923d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:32:31 np0005603500 nova_compute[182934]: 2026-01-31 06:32:31.554 182938 DEBUG oslo_concurrency.lockutils [req-34e38383-8c82-4143-8be7-f57f9a080255 req-123dcfa5-7a80-43a5-a796-ed0c098d11f3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:32:31 np0005603500 nova_compute[182934]: 2026-01-31 06:32:31.554 182938 DEBUG oslo_concurrency.lockutils [req-34e38383-8c82-4143-8be7-f57f9a080255 req-123dcfa5-7a80-43a5-a796-ed0c098d11f3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:32:31 np0005603500 nova_compute[182934]: 2026-01-31 06:32:31.555 182938 DEBUG nova.network.neutron [req-34e38383-8c82-4143-8be7-f57f9a080255 req-123dcfa5-7a80-43a5-a796-ed0c098d11f3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Refreshing network info cache for port 09341075-d930-414d-afa9-4ab4a986923d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:32:31 np0005603500 nova_compute[182934]: 2026-01-31 06:32:31.903 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:32:32 np0005603500 nova_compute[182934]: 2026-01-31 06:32:32.625 182938 DEBUG nova.network.neutron [req-34e38383-8c82-4143-8be7-f57f9a080255 req-123dcfa5-7a80-43a5-a796-ed0c098d11f3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:32:32 np0005603500 nova_compute[182934]: 2026-01-31 06:32:32.779 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:33 np0005603500 nova_compute[182934]: 2026-01-31 06:32:33.573 182938 DEBUG nova.network.neutron [req-34e38383-8c82-4143-8be7-f57f9a080255 req-123dcfa5-7a80-43a5-a796-ed0c098d11f3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:32:34 np0005603500 nova_compute[182934]: 2026-01-31 06:32:34.084 182938 DEBUG oslo_concurrency.lockutils [req-34e38383-8c82-4143-8be7-f57f9a080255 req-123dcfa5-7a80-43a5-a796-ed0c098d11f3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:32:34 np0005603500 nova_compute[182934]: 2026-01-31 06:32:34.085 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:32:34 np0005603500 nova_compute[182934]: 2026-01-31 06:32:34.085 182938 DEBUG nova.network.neutron [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:32:34 np0005603500 nova_compute[182934]: 2026-01-31 06:32:34.325 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:35 np0005603500 nova_compute[182934]: 2026-01-31 06:32:35.594 182938 DEBUG nova.network.neutron [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:32:36 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:36.936 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:32:37 np0005603500 nova_compute[182934]: 2026-01-31 06:32:37.782 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:39 np0005603500 nova_compute[182934]: 2026-01-31 06:32:39.327 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:39 np0005603500 nova_compute[182934]: 2026-01-31 06:32:39.614 182938 DEBUG nova.network.neutron [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Updating instance_info_cache with network_info: [{"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.122 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.123 182938 DEBUG nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Instance network_info: |[{"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.125 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Start _get_guest_xml network_info=[{"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:32:40 np0005603500 podman[212250]: 2026-01-31 06:32:40.127325284 +0000 UTC m=+0.048260034 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true)
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.129 182938 WARNING nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.130 182938 DEBUG nova.virt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1569534366', uuid='c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_devic
e_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841160.1302686) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.141 182938 DEBUG nova.virt.libvirt.host [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.142 182938 DEBUG nova.virt.libvirt.host [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.145 182938 DEBUG nova.virt.libvirt.host [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.145 182938 DEBUG nova.virt.libvirt.host [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.146 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.146 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.146 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.146 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.147 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.147 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.147 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.147 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.147 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.147 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.148 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.148 182938 DEBUG nova.virt.hardware [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.151 182938 DEBUG nova.virt.libvirt.vif [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:32:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1569534366',display_name='tempest-TestNetworkBasicOps-server-1569534366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1569534366',id=2,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFIvnkkYhlCL8dIZspKySTZKjVoDlu49J+KEOnm5FsRbYW8+zdL5kZz3TsHcJBDflMqUFMnQkZfee/2xzlLGrcu4m3iYKDrcOFor6OwKTAthOVCUWOeqd3AAwSPfVrOtvw==',key_name='tempest-TestNetworkBasicOps-25733378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0vv2vi9p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:32:27Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.152 182938 DEBUG nova.network.os_vif_util [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.152 182938 DEBUG nova.network.os_vif_util [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:0b:d7,bridge_name='br-int',has_traffic_filtering=True,id=09341075-d930-414d-afa9-4ab4a986923d,network=Network(8ee2d8dd-2830-4df2-a30b-d14056986193),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09341075-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.153 182938 DEBUG nova.objects.instance [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.662 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <uuid>c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e</uuid>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <name>instance-00000002</name>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-1569534366</nova:name>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:32:40</nova:creationTime>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:        <nova:port uuid="09341075-d930-414d-afa9-4ab4a986923d">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <entry name="serial">c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e</entry>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <entry name="uuid">c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e</entry>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk.config"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:ba:0b:d7"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <target dev="tap09341075-d9"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/console.log" append="off"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:32:40 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:32:40 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:32:40 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:32:40 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.663 182938 DEBUG nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Preparing to wait for external event network-vif-plugged-09341075-d930-414d-afa9-4ab4a986923d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.663 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.664 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.664 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.664 182938 DEBUG nova.virt.libvirt.vif [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:32:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1569534366',display_name='tempest-TestNetworkBasicOps-server-1569534366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1569534366',id=2,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFIvnkkYhlCL8dIZspKySTZKjVoDlu49J+KEOnm5FsRbYW8+zdL5kZz3TsHcJBDflMqUFMnQkZfee/2xzlLGrcu4m3iYKDrcOFor6OwKTAthOVCUWOeqd3AAwSPfVrOtvw==',key_name='tempest-TestNetworkBasicOps-25733378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0vv2vi9p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:32:27Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.665 182938 DEBUG nova.network.os_vif_util [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.665 182938 DEBUG nova.network.os_vif_util [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:0b:d7,bridge_name='br-int',has_traffic_filtering=True,id=09341075-d930-414d-afa9-4ab4a986923d,network=Network(8ee2d8dd-2830-4df2-a30b-d14056986193),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09341075-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.665 182938 DEBUG os_vif [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:0b:d7,bridge_name='br-int',has_traffic_filtering=True,id=09341075-d930-414d-afa9-4ab4a986923d,network=Network(8ee2d8dd-2830-4df2-a30b-d14056986193),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09341075-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.666 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.666 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.666 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.667 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.667 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'bce4936c-c92a-5296-80d4-e53bc57c8503', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.668 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.669 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.671 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.671 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09341075-d9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.672 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap09341075-d9, col_values=(('qos', UUID('d4a94ce7-6416-4b32-97c8-4996c3012b69')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.672 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap09341075-d9, col_values=(('external_ids', {'iface-id': '09341075-d930-414d-afa9-4ab4a986923d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:0b:d7', 'vm-uuid': 'c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.673 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:40 np0005603500 NetworkManager[55506]: <info>  [1769841160.6739] manager: (tap09341075-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.675 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.678 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:40 np0005603500 nova_compute[182934]: 2026-01-31 06:32:40.679 182938 INFO os_vif [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:0b:d7,bridge_name='br-int',has_traffic_filtering=True,id=09341075-d930-414d-afa9-4ab4a986923d,network=Network(8ee2d8dd-2830-4df2-a30b-d14056986193),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09341075-d9')
Jan 31 01:32:42 np0005603500 podman[212271]: 2026-01-31 06:32:42.125790634 +0000 UTC m=+0.048305526 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:32:42 np0005603500 nova_compute[182934]: 2026-01-31 06:32:42.282 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:32:42 np0005603500 nova_compute[182934]: 2026-01-31 06:32:42.282 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:32:42 np0005603500 nova_compute[182934]: 2026-01-31 06:32:42.283 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:ba:0b:d7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:32:42 np0005603500 nova_compute[182934]: 2026-01-31 06:32:42.283 182938 INFO nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Using config drive
Jan 31 01:32:42 np0005603500 nova_compute[182934]: 2026-01-31 06:32:42.784 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:44 np0005603500 nova_compute[182934]: 2026-01-31 06:32:44.097 182938 INFO nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Creating config drive at /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk.config
Jan 31 01:32:44 np0005603500 nova_compute[182934]: 2026-01-31 06:32:44.101 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpztbcijc0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:32:44 np0005603500 nova_compute[182934]: 2026-01-31 06:32:44.223 182938 DEBUG oslo_concurrency.processutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpztbcijc0" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:32:44 np0005603500 kernel: tap09341075-d9: entered promiscuous mode
Jan 31 01:32:44 np0005603500 NetworkManager[55506]: <info>  [1769841164.2696] manager: (tap09341075-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Jan 31 01:32:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:32:44Z|00048|binding|INFO|Claiming lport 09341075-d930-414d-afa9-4ab4a986923d for this chassis.
Jan 31 01:32:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:32:44Z|00049|binding|INFO|09341075-d930-414d-afa9-4ab4a986923d: Claiming fa:16:3e:ba:0b:d7 10.100.0.22
Jan 31 01:32:44 np0005603500 nova_compute[182934]: 2026-01-31 06:32:44.271 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:44 np0005603500 nova_compute[182934]: 2026-01-31 06:32:44.284 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:32:44Z|00050|binding|INFO|Setting lport 09341075-d930-414d-afa9-4ab4a986923d ovn-installed in OVS
Jan 31 01:32:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:32:44Z|00051|binding|INFO|Setting lport 09341075-d930-414d-afa9-4ab4a986923d up in Southbound
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.289 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:0b:d7 10.100.0.22'], port_security=['fa:16:3e:ba:0b:d7 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ee2d8dd-2830-4df2-a30b-d14056986193', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9863cfba-1576-4b88-9bab-c62b403d54ef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ea3e50b-96a6-4b10-8620-0c365db289f7, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=09341075-d930-414d-afa9-4ab4a986923d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:32:44 np0005603500 nova_compute[182934]: 2026-01-31 06:32:44.289 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.290 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 09341075-d930-414d-afa9-4ab4a986923d in datapath 8ee2d8dd-2830-4df2-a30b-d14056986193 bound to our chassis
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.291 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8ee2d8dd-2830-4df2-a30b-d14056986193
Jan 31 01:32:44 np0005603500 systemd-udevd[212310]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:32:44 np0005603500 systemd-machined[154375]: New machine qemu-2-instance-00000002.
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.300 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[10343ca3-ef82-4b1b-931b-a4ccd4718b8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.301 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8ee2d8dd-21 in ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.302 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8ee2d8dd-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.303 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[b9aef5fc-5b8f-4147-b395-169765bcc6e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.303 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7c827c-8ccb-45cf-84c6-0eb41737c44e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 NetworkManager[55506]: <info>  [1769841164.3050] device (tap09341075-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:32:44 np0005603500 NetworkManager[55506]: <info>  [1769841164.3057] device (tap09341075-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:32:44 np0005603500 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.318 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[ca99f611-f12c-4fb1-9c18-1d0de6bbd0e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.340 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a5b15f82-7ebd-4fd0-8106-a55bdb329e41]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.360 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[c450d6a7-49fe-4dde-b62d-dd7b27fb0d67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.364 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[660a66a1-3d41-45ef-9de0-d313430625d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 NetworkManager[55506]: <info>  [1769841164.3656] manager: (tap8ee2d8dd-20): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.397 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[980043e7-5a2c-4104-84b5-078c2bddc7d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.401 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[48bbe476-1c1c-4c9e-a107-ebef5b6e0be8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 NetworkManager[55506]: <info>  [1769841164.4189] device (tap8ee2d8dd-20): carrier: link connected
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.421 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[856335ca-3f26-4e41-9dab-8ef3283eb699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.434 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[dd8e7e15-c9a9-4fc9-879c-67209f3d5ef4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8ee2d8dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:8c:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347571, 'reachable_time': 24603, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212345, 'error': None, 'target': 'ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.445 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5165b0-a4a0-4f72-a6c1-5db6e6068f52]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:8cdf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347571, 'tstamp': 347571}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212346, 'error': None, 'target': 'ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.456 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[0d33580a-16d7-40bc-abdc-ce8b0f9a046e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8ee2d8dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:8c:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347571, 'reachable_time': 24603, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212347, 'error': None, 'target': 'ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.476 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6a174154-f09e-4ff1-81fb-b6bebb8565f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.514 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[2d388241-a984-46a3-a54f-29db898c7f66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.515 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ee2d8dd-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.516 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.516 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ee2d8dd-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:32:44 np0005603500 NetworkManager[55506]: <info>  [1769841164.5183] manager: (tap8ee2d8dd-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 31 01:32:44 np0005603500 kernel: tap8ee2d8dd-20: entered promiscuous mode
Jan 31 01:32:44 np0005603500 nova_compute[182934]: 2026-01-31 06:32:44.519 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.520 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8ee2d8dd-20, col_values=(('external_ids', {'iface-id': '819f4165-cac3-47d8-a691-9a70a2b6604f'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:32:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:32:44Z|00052|binding|INFO|Releasing lport 819f4165-cac3-47d8-a691-9a70a2b6604f from this chassis (sb_readonly=0)
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.523 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[41065f83-edab-4c3b-81db-14db44422d08]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.524 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.524 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.524 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8ee2d8dd-2830-4df2-a30b-d14056986193 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.524 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:32:44 np0005603500 nova_compute[182934]: 2026-01-31 06:32:44.525 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.525 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[68ac6a77-b083-4a51-b0de-c29b5fc4a943]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.526 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.526 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[398655ef-7877-46e8-b90c-88fb194c539d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.527 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-8ee2d8dd-2830-4df2-a30b-d14056986193
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID 8ee2d8dd-2830-4df2-a30b-d14056986193
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:32:44 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:44.527 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193', 'env', 'PROCESS_TAG=haproxy-8ee2d8dd-2830-4df2-a30b-d14056986193', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8ee2d8dd-2830-4df2-a30b-d14056986193.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:32:44 np0005603500 podman[212381]: 2026-01-31 06:32:44.902961796 +0000 UTC m=+0.095935469 container create 7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 01:32:44 np0005603500 podman[212381]: 2026-01-31 06:32:44.833981884 +0000 UTC m=+0.026955587 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:32:44 np0005603500 systemd[1]: Started libpod-conmon-7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14.scope.
Jan 31 01:32:44 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:32:44 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/044c4421412cf2430d1e56a9ac9b5a8f035d36e5842079f51779aea063b13f35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:32:44 np0005603500 podman[212381]: 2026-01-31 06:32:44.969417659 +0000 UTC m=+0.162391372 container init 7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:32:44 np0005603500 podman[212381]: 2026-01-31 06:32:44.97509782 +0000 UTC m=+0.168071503 container start 7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:32:44 np0005603500 neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193[212402]: [NOTICE]   (212408) : New worker (212410) forked
Jan 31 01:32:44 np0005603500 neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193[212402]: [NOTICE]   (212408) : Loading success.
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.675 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.737 182938 DEBUG nova.compute.manager [req-2883fd6c-f716-43a2-9989-78c5c8b9c113 req-1d4f3f6a-ebbb-4eb2-840d-f0a47bbcebce 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Received event network-vif-plugged-09341075-d930-414d-afa9-4ab4a986923d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.738 182938 DEBUG oslo_concurrency.lockutils [req-2883fd6c-f716-43a2-9989-78c5c8b9c113 req-1d4f3f6a-ebbb-4eb2-840d-f0a47bbcebce 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.738 182938 DEBUG oslo_concurrency.lockutils [req-2883fd6c-f716-43a2-9989-78c5c8b9c113 req-1d4f3f6a-ebbb-4eb2-840d-f0a47bbcebce 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.739 182938 DEBUG oslo_concurrency.lockutils [req-2883fd6c-f716-43a2-9989-78c5c8b9c113 req-1d4f3f6a-ebbb-4eb2-840d-f0a47bbcebce 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.739 182938 DEBUG nova.compute.manager [req-2883fd6c-f716-43a2-9989-78c5c8b9c113 req-1d4f3f6a-ebbb-4eb2-840d-f0a47bbcebce 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Processing event network-vif-plugged-09341075-d930-414d-afa9-4ab4a986923d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.740 182938 DEBUG nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.746 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.752 182938 INFO nova.virt.libvirt.driver [-] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Instance spawned successfully.
Jan 31 01:32:45 np0005603500 nova_compute[182934]: 2026-01-31 06:32:45.753 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:32:46 np0005603500 nova_compute[182934]: 2026-01-31 06:32:46.403 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:32:46 np0005603500 nova_compute[182934]: 2026-01-31 06:32:46.403 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:32:46 np0005603500 nova_compute[182934]: 2026-01-31 06:32:46.404 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:32:46 np0005603500 nova_compute[182934]: 2026-01-31 06:32:46.404 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:32:46 np0005603500 nova_compute[182934]: 2026-01-31 06:32:46.405 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:32:46 np0005603500 nova_compute[182934]: 2026-01-31 06:32:46.405 182938 DEBUG nova.virt.libvirt.driver [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:32:46 np0005603500 nova_compute[182934]: 2026-01-31 06:32:46.967 182938 INFO nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Took 17.92 seconds to spawn the instance on the hypervisor.
Jan 31 01:32:46 np0005603500 nova_compute[182934]: 2026-01-31 06:32:46.968 182938 DEBUG nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:32:47 np0005603500 nova_compute[182934]: 2026-01-31 06:32:47.535 182938 INFO nova.compute.manager [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Took 24.60 seconds to build instance.
Jan 31 01:32:47 np0005603500 nova_compute[182934]: 2026-01-31 06:32:47.786 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:48 np0005603500 podman[212434]: 2026-01-31 06:32:48.158270585 +0000 UTC m=+0.076478882 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, version=9.7, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Jan 31 01:32:48 np0005603500 nova_compute[182934]: 2026-01-31 06:32:48.171 182938 DEBUG nova.compute.manager [req-2b5c8704-6655-41a5-8fc9-07cefa899f0f req-edc2ba86-ce7c-4892-ab25-5d10154f550e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Received event network-vif-plugged-09341075-d930-414d-afa9-4ab4a986923d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:32:48 np0005603500 nova_compute[182934]: 2026-01-31 06:32:48.171 182938 DEBUG oslo_concurrency.lockutils [req-2b5c8704-6655-41a5-8fc9-07cefa899f0f req-edc2ba86-ce7c-4892-ab25-5d10154f550e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:48 np0005603500 nova_compute[182934]: 2026-01-31 06:32:48.172 182938 DEBUG oslo_concurrency.lockutils [req-2b5c8704-6655-41a5-8fc9-07cefa899f0f req-edc2ba86-ce7c-4892-ab25-5d10154f550e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:48 np0005603500 nova_compute[182934]: 2026-01-31 06:32:48.172 182938 DEBUG oslo_concurrency.lockutils [req-2b5c8704-6655-41a5-8fc9-07cefa899f0f req-edc2ba86-ce7c-4892-ab25-5d10154f550e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:48 np0005603500 nova_compute[182934]: 2026-01-31 06:32:48.172 182938 DEBUG nova.compute.manager [req-2b5c8704-6655-41a5-8fc9-07cefa899f0f req-edc2ba86-ce7c-4892-ab25-5d10154f550e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] No waiting events found dispatching network-vif-plugged-09341075-d930-414d-afa9-4ab4a986923d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:32:48 np0005603500 nova_compute[182934]: 2026-01-31 06:32:48.172 182938 WARNING nova.compute.manager [req-2b5c8704-6655-41a5-8fc9-07cefa899f0f req-edc2ba86-ce7c-4892-ab25-5d10154f550e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Received unexpected event network-vif-plugged-09341075-d930-414d-afa9-4ab4a986923d for instance with vm_state active and task_state None.
Jan 31 01:32:48 np0005603500 nova_compute[182934]: 2026-01-31 06:32:48.227 182938 DEBUG oslo_concurrency.lockutils [None req-158d301a-b28e-4244-939a-6b5875e4c3db dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.472s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:50 np0005603500 nova_compute[182934]: 2026-01-31 06:32:50.678 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:52 np0005603500 podman[212455]: 2026-01-31 06:32:52.16853369 +0000 UTC m=+0.085076284 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 01:32:52 np0005603500 nova_compute[182934]: 2026-01-31 06:32:52.838 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:55.602 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:32:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:55.603 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:32:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:32:55.603 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:32:55 np0005603500 nova_compute[182934]: 2026-01-31 06:32:55.680 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:57 np0005603500 podman[212486]: 2026-01-31 06:32:57.148373794 +0000 UTC m=+0.060851205 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 31 01:32:57 np0005603500 nova_compute[182934]: 2026-01-31 06:32:57.879 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:32:59 np0005603500 podman[212506]: 2026-01-31 06:32:59.146777314 +0000 UTC m=+0.064428519 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:33:00 np0005603500 nova_compute[182934]: 2026-01-31 06:33:00.684 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:33:02Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ba:0b:d7 10.100.0.22
Jan 31 01:33:02 np0005603500 ovn_controller[95398]: 2026-01-31T06:33:02Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:0b:d7 10.100.0.22
Jan 31 01:33:02 np0005603500 nova_compute[182934]: 2026-01-31 06:33:02.880 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:05 np0005603500 nova_compute[182934]: 2026-01-31 06:33:05.724 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:07 np0005603500 nova_compute[182934]: 2026-01-31 06:33:07.884 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:10 np0005603500 nova_compute[182934]: 2026-01-31 06:33:10.727 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:10 np0005603500 nova_compute[182934]: 2026-01-31 06:33:10.876 182938 DEBUG oslo_concurrency.lockutils [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:10 np0005603500 nova_compute[182934]: 2026-01-31 06:33:10.876 182938 DEBUG oslo_concurrency.lockutils [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:10 np0005603500 nova_compute[182934]: 2026-01-31 06:33:10.877 182938 DEBUG oslo_concurrency.lockutils [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:10 np0005603500 nova_compute[182934]: 2026-01-31 06:33:10.877 182938 DEBUG oslo_concurrency.lockutils [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:10 np0005603500 nova_compute[182934]: 2026-01-31 06:33:10.877 182938 DEBUG oslo_concurrency.lockutils [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:10 np0005603500 nova_compute[182934]: 2026-01-31 06:33:10.879 182938 INFO nova.compute.manager [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Terminating instance
Jan 31 01:33:11 np0005603500 podman[212543]: 2026-01-31 06:33:11.154715036 +0000 UTC m=+0.071581986 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.386 182938 DEBUG nova.compute.manager [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:33:11 np0005603500 kernel: tap09341075-d9 (unregistering): left promiscuous mode
Jan 31 01:33:11 np0005603500 NetworkManager[55506]: <info>  [1769841191.4074] device (tap09341075-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.414 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:11 np0005603500 ovn_controller[95398]: 2026-01-31T06:33:11Z|00053|binding|INFO|Releasing lport 09341075-d930-414d-afa9-4ab4a986923d from this chassis (sb_readonly=0)
Jan 31 01:33:11 np0005603500 ovn_controller[95398]: 2026-01-31T06:33:11Z|00054|binding|INFO|Setting lport 09341075-d930-414d-afa9-4ab4a986923d down in Southbound
Jan 31 01:33:11 np0005603500 ovn_controller[95398]: 2026-01-31T06:33:11Z|00055|binding|INFO|Removing iface tap09341075-d9 ovn-installed in OVS
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.417 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.421 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:11 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:11.424 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:0b:d7 10.100.0.22'], port_security=['fa:16:3e:ba:0b:d7 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ee2d8dd-2830-4df2-a30b-d14056986193', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9863cfba-1576-4b88-9bab-c62b403d54ef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ea3e50b-96a6-4b10-8620-0c365db289f7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=09341075-d930-414d-afa9-4ab4a986923d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:33:11 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:11.425 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 09341075-d930-414d-afa9-4ab4a986923d in datapath 8ee2d8dd-2830-4df2-a30b-d14056986193 unbound from our chassis
Jan 31 01:33:11 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:11.426 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8ee2d8dd-2830-4df2-a30b-d14056986193, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:33:11 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:11.427 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e948e596-0d7d-4ac7-8a16-432af4217132]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:11 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:11.428 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193 namespace which is not needed anymore
Jan 31 01:33:11 np0005603500 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 31 01:33:11 np0005603500 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 12.682s CPU time.
Jan 31 01:33:11 np0005603500 systemd-machined[154375]: Machine qemu-2-instance-00000002 terminated.
Jan 31 01:33:11 np0005603500 neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193[212402]: [NOTICE]   (212408) : haproxy version is 2.8.14-c23fe91
Jan 31 01:33:11 np0005603500 neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193[212402]: [NOTICE]   (212408) : path to executable is /usr/sbin/haproxy
Jan 31 01:33:11 np0005603500 neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193[212402]: [WARNING]  (212408) : Exiting Master process...
Jan 31 01:33:11 np0005603500 podman[212585]: 2026-01-31 06:33:11.526590996 +0000 UTC m=+0.025272614 container kill 7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 01:33:11 np0005603500 neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193[212402]: [ALERT]    (212408) : Current worker (212410) exited with code 143 (Terminated)
Jan 31 01:33:11 np0005603500 neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193[212402]: [WARNING]  (212408) : All workers exited. Exiting... (0)
Jan 31 01:33:11 np0005603500 systemd[1]: libpod-7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14.scope: Deactivated successfully.
Jan 31 01:33:11 np0005603500 conmon[212402]: conmon 7a5cc4c38cc6bdf4b4f5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14.scope/container/memory.events
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.609 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.612 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.641 182938 INFO nova.virt.libvirt.driver [-] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Instance destroyed successfully.
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.642 182938 DEBUG nova.objects.instance [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:33:11 np0005603500 podman[212600]: 2026-01-31 06:33:11.678246996 +0000 UTC m=+0.134474955 container died 7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.835 182938 DEBUG nova.compute.manager [req-01931ef4-3379-476f-bd8f-3a3f299e21cf req-8420c524-2da4-4c75-b720-257931a15f53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Received event network-vif-unplugged-09341075-d930-414d-afa9-4ab4a986923d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.836 182938 DEBUG oslo_concurrency.lockutils [req-01931ef4-3379-476f-bd8f-3a3f299e21cf req-8420c524-2da4-4c75-b720-257931a15f53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.836 182938 DEBUG oslo_concurrency.lockutils [req-01931ef4-3379-476f-bd8f-3a3f299e21cf req-8420c524-2da4-4c75-b720-257931a15f53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.837 182938 DEBUG oslo_concurrency.lockutils [req-01931ef4-3379-476f-bd8f-3a3f299e21cf req-8420c524-2da4-4c75-b720-257931a15f53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.837 182938 DEBUG nova.compute.manager [req-01931ef4-3379-476f-bd8f-3a3f299e21cf req-8420c524-2da4-4c75-b720-257931a15f53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] No waiting events found dispatching network-vif-unplugged-09341075-d930-414d-afa9-4ab4a986923d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:33:11 np0005603500 nova_compute[182934]: 2026-01-31 06:33:11.838 182938 DEBUG nova.compute.manager [req-01931ef4-3379-476f-bd8f-3a3f299e21cf req-8420c524-2da4-4c75-b720-257931a15f53 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Received event network-vif-unplugged-09341075-d930-414d-afa9-4ab4a986923d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:33:11 np0005603500 systemd[1]: var-lib-containers-storage-overlay-044c4421412cf2430d1e56a9ac9b5a8f035d36e5842079f51779aea063b13f35-merged.mount: Deactivated successfully.
Jan 31 01:33:11 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14-userdata-shm.mount: Deactivated successfully.
Jan 31 01:33:12 np0005603500 podman[212600]: 2026-01-31 06:33:12.065899888 +0000 UTC m=+0.522127837 container cleanup 7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:33:12 np0005603500 systemd[1]: libpod-conmon-7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14.scope: Deactivated successfully.
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.170 182938 DEBUG nova.virt.libvirt.vif [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:32:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1569534366',display_name='tempest-TestNetworkBasicOps-server-1569534366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1569534366',id=2,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFIvnkkYhlCL8dIZspKySTZKjVoDlu49J+KEOnm5FsRbYW8+zdL5kZz3TsHcJBDflMqUFMnQkZfee/2xzlLGrcu4m3iYKDrcOFor6OwKTAthOVCUWOeqd3AAwSPfVrOtvw==',key_name='tempest-TestNetworkBasicOps-25733378',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:32:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0vv2vi9p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:32:47Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.170 182938 DEBUG nova.network.os_vif_util [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "09341075-d930-414d-afa9-4ab4a986923d", "address": "fa:16:3e:ba:0b:d7", "network": {"id": "8ee2d8dd-2830-4df2-a30b-d14056986193", "bridge": "br-int", "label": "tempest-network-smoke--680207717", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09341075-d9", "ovs_interfaceid": "09341075-d930-414d-afa9-4ab4a986923d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.171 182938 DEBUG nova.network.os_vif_util [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:0b:d7,bridge_name='br-int',has_traffic_filtering=True,id=09341075-d930-414d-afa9-4ab4a986923d,network=Network(8ee2d8dd-2830-4df2-a30b-d14056986193),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09341075-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.171 182938 DEBUG os_vif [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:0b:d7,bridge_name='br-int',has_traffic_filtering=True,id=09341075-d930-414d-afa9-4ab4a986923d,network=Network(8ee2d8dd-2830-4df2-a30b-d14056986193),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09341075-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.173 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.173 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09341075-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.175 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.177 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.178 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.178 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d4a94ce7-6416-4b32-97c8-4996c3012b69) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.179 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.180 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.181 182938 INFO os_vif [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:0b:d7,bridge_name='br-int',has_traffic_filtering=True,id=09341075-d930-414d-afa9-4ab4a986923d,network=Network(8ee2d8dd-2830-4df2-a30b-d14056986193),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09341075-d9')
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.182 182938 INFO nova.virt.libvirt.driver [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Deleting instance files /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e_del
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.182 182938 INFO nova.virt.libvirt.driver [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Deletion of /var/lib/nova/instances/c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e_del complete
Jan 31 01:33:12 np0005603500 podman[212629]: 2026-01-31 06:33:12.199003858 +0000 UTC m=+0.524685397 container remove 7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.203 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e777432d-2ae3-4c58-854f-de8fcd756f2a]: (4, ("Sat Jan 31 06:33:11 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193 (7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14)\n7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14\nSat Jan 31 06:33:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193 (7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14)\n7a5cc4c38cc6bdf4b4f59b94966a3378c5db9801ffe17ddce2f6f31a3c309a14\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.205 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d577b345-8031-4196-9b09-743df6e77de6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.205 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8ee2d8dd-2830-4df2-a30b-d14056986193.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.206 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e57ce0-763b-4876-b7c6-c8e46977c3c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.207 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ee2d8dd-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.210 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:12 np0005603500 kernel: tap8ee2d8dd-20: left promiscuous mode
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.215 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.217 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d487670b-78c8-4271-a1f1-3ec892b097ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.240 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c24df065-5f2f-493b-ba89-0d7a899f3c8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.241 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4051ce-f27a-4530-bbcf-f38d4410cc2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.253 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[b62ebb85-5cef-4a3b-98e7-9a4c34e93801]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347565, 'reachable_time': 19707, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212659, 'error': None, 'target': 'ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:12 np0005603500 systemd[1]: run-netns-ovnmeta\x2d8ee2d8dd\x2d2830\x2d4df2\x2da30b\x2dd14056986193.mount: Deactivated successfully.
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.262 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8ee2d8dd-2830-4df2-a30b-d14056986193 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:33:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:12.263 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[256344bf-e91e-4079-b5d9-778546d89e66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:12 np0005603500 podman[212645]: 2026-01-31 06:33:12.27392219 +0000 UTC m=+0.045909661 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.693 182938 INFO nova.compute.manager [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Took 1.31 seconds to destroy the instance on the hypervisor.
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.695 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.695 182938 DEBUG nova.compute.manager [-] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.696 182938 DEBUG nova.network.neutron [-] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:33:12 np0005603500 nova_compute[182934]: 2026-01-31 06:33:12.886 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:14 np0005603500 nova_compute[182934]: 2026-01-31 06:33:14.143 182938 DEBUG nova.compute.manager [req-e1aba7bd-baf7-463f-8e79-b8e2a337c81e req-182f5815-9af5-42ad-ae98-b8352f2a62d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Received event network-vif-plugged-09341075-d930-414d-afa9-4ab4a986923d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:33:14 np0005603500 nova_compute[182934]: 2026-01-31 06:33:14.143 182938 DEBUG oslo_concurrency.lockutils [req-e1aba7bd-baf7-463f-8e79-b8e2a337c81e req-182f5815-9af5-42ad-ae98-b8352f2a62d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:14 np0005603500 nova_compute[182934]: 2026-01-31 06:33:14.144 182938 DEBUG oslo_concurrency.lockutils [req-e1aba7bd-baf7-463f-8e79-b8e2a337c81e req-182f5815-9af5-42ad-ae98-b8352f2a62d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:14 np0005603500 nova_compute[182934]: 2026-01-31 06:33:14.144 182938 DEBUG oslo_concurrency.lockutils [req-e1aba7bd-baf7-463f-8e79-b8e2a337c81e req-182f5815-9af5-42ad-ae98-b8352f2a62d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:14 np0005603500 nova_compute[182934]: 2026-01-31 06:33:14.144 182938 DEBUG nova.compute.manager [req-e1aba7bd-baf7-463f-8e79-b8e2a337c81e req-182f5815-9af5-42ad-ae98-b8352f2a62d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] No waiting events found dispatching network-vif-plugged-09341075-d930-414d-afa9-4ab4a986923d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:33:14 np0005603500 nova_compute[182934]: 2026-01-31 06:33:14.144 182938 WARNING nova.compute.manager [req-e1aba7bd-baf7-463f-8e79-b8e2a337c81e req-182f5815-9af5-42ad-ae98-b8352f2a62d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Received unexpected event network-vif-plugged-09341075-d930-414d-afa9-4ab4a986923d for instance with vm_state active and task_state deleting.
Jan 31 01:33:16 np0005603500 nova_compute[182934]: 2026-01-31 06:33:16.710 182938 DEBUG nova.network.neutron [-] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:33:16 np0005603500 nova_compute[182934]: 2026-01-31 06:33:16.720 182938 DEBUG nova.compute.manager [req-57dd66d8-6bcf-441c-8641-c60602a3562b req-1e2b1ad2-50dc-4484-a5a2-7e8e78ba4d64 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Received event network-vif-deleted-09341075-d930-414d-afa9-4ab4a986923d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:33:16 np0005603500 nova_compute[182934]: 2026-01-31 06:33:16.721 182938 INFO nova.compute.manager [req-57dd66d8-6bcf-441c-8641-c60602a3562b req-1e2b1ad2-50dc-4484-a5a2-7e8e78ba4d64 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Neutron deleted interface 09341075-d930-414d-afa9-4ab4a986923d; detaching it from the instance and deleting it from the info cache
Jan 31 01:33:16 np0005603500 nova_compute[182934]: 2026-01-31 06:33:16.721 182938 DEBUG nova.network.neutron [req-57dd66d8-6bcf-441c-8641-c60602a3562b req-1e2b1ad2-50dc-4484-a5a2-7e8e78ba4d64 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:33:17 np0005603500 nova_compute[182934]: 2026-01-31 06:33:17.180 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:17 np0005603500 nova_compute[182934]: 2026-01-31 06:33:17.869 182938 INFO nova.compute.manager [-] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Took 5.17 seconds to deallocate network for instance.
Jan 31 01:33:17 np0005603500 nova_compute[182934]: 2026-01-31 06:33:17.888 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:17 np0005603500 nova_compute[182934]: 2026-01-31 06:33:17.963 182938 DEBUG nova.compute.manager [req-57dd66d8-6bcf-441c-8641-c60602a3562b req-1e2b1ad2-50dc-4484-a5a2-7e8e78ba4d64 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e] Detach interface failed, port_id=09341075-d930-414d-afa9-4ab4a986923d, reason: Instance c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Jan 31 01:33:18 np0005603500 nova_compute[182934]: 2026-01-31 06:33:18.382 182938 DEBUG oslo_concurrency.lockutils [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:18 np0005603500 nova_compute[182934]: 2026-01-31 06:33:18.383 182938 DEBUG oslo_concurrency.lockutils [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:18 np0005603500 nova_compute[182934]: 2026-01-31 06:33:18.475 182938 DEBUG nova.compute.provider_tree [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:33:18 np0005603500 nova_compute[182934]: 2026-01-31 06:33:18.983 182938 DEBUG nova.scheduler.client.report [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:33:19 np0005603500 podman[212672]: 2026-01-31 06:33:19.141045864 +0000 UTC m=+0.058952255 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal)
Jan 31 01:33:19 np0005603500 nova_compute[182934]: 2026-01-31 06:33:19.500 182938 DEBUG oslo_concurrency.lockutils [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:19 np0005603500 nova_compute[182934]: 2026-01-31 06:33:19.533 182938 INFO nova.scheduler.client.report [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e
Jan 31 01:33:20 np0005603500 nova_compute[182934]: 2026-01-31 06:33:20.562 182938 DEBUG oslo_concurrency.lockutils [None req-5bb533ab-18ad-4731-9ae3-922843882778 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c1ef6680-57fa-4cfb-b9ce-c0cca2f53d0e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:22 np0005603500 nova_compute[182934]: 2026-01-31 06:33:22.182 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:22 np0005603500 nova_compute[182934]: 2026-01-31 06:33:22.890 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:23 np0005603500 podman[212693]: 2026-01-31 06:33:23.19134203 +0000 UTC m=+0.111916867 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 01:33:24 np0005603500 nova_compute[182934]: 2026-01-31 06:33:24.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:33:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:33:24Z|00056|binding|INFO|Releasing lport bbc7e9d4-0e42-4731-a0a1-d6912b3f33d9 from this chassis (sb_readonly=0)
Jan 31 01:33:24 np0005603500 nova_compute[182934]: 2026-01-31 06:33:24.402 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:24 np0005603500 nova_compute[182934]: 2026-01-31 06:33:24.660 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:24 np0005603500 nova_compute[182934]: 2026-01-31 06:33:24.660 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:24 np0005603500 nova_compute[182934]: 2026-01-31 06:33:24.660 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:24 np0005603500 nova_compute[182934]: 2026-01-31 06:33:24.661 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:33:25 np0005603500 nova_compute[182934]: 2026-01-31 06:33:25.701 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:33:25 np0005603500 nova_compute[182934]: 2026-01-31 06:33:25.744 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:33:25 np0005603500 nova_compute[182934]: 2026-01-31 06:33:25.745 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:33:25 np0005603500 nova_compute[182934]: 2026-01-31 06:33:25.788 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:33:25 np0005603500 nova_compute[182934]: 2026-01-31 06:33:25.898 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:33:25 np0005603500 nova_compute[182934]: 2026-01-31 06:33:25.899 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5613MB free_disk=73.18716812133789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:33:25 np0005603500 nova_compute[182934]: 2026-01-31 06:33:25.900 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:25 np0005603500 nova_compute[182934]: 2026-01-31 06:33:25.900 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:26 np0005603500 nova_compute[182934]: 2026-01-31 06:33:26.439 182938 DEBUG nova.compute.manager [req-8358bd1e-4e17-44b4-a98f-a31c18a1b252 req-7b12ea24-fe04-4c4c-9467-9ceb627f0cf3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received event network-changed-57786dbf-8d90-4cbe-834d-3cd072a75d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:33:26 np0005603500 nova_compute[182934]: 2026-01-31 06:33:26.440 182938 DEBUG nova.compute.manager [req-8358bd1e-4e17-44b4-a98f-a31c18a1b252 req-7b12ea24-fe04-4c4c-9467-9ceb627f0cf3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Refreshing instance network info cache due to event network-changed-57786dbf-8d90-4cbe-834d-3cd072a75d1f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:33:26 np0005603500 nova_compute[182934]: 2026-01-31 06:33:26.440 182938 DEBUG oslo_concurrency.lockutils [req-8358bd1e-4e17-44b4-a98f-a31c18a1b252 req-7b12ea24-fe04-4c4c-9467-9ceb627f0cf3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:33:26 np0005603500 nova_compute[182934]: 2026-01-31 06:33:26.440 182938 DEBUG oslo_concurrency.lockutils [req-8358bd1e-4e17-44b4-a98f-a31c18a1b252 req-7b12ea24-fe04-4c4c-9467-9ceb627f0cf3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:33:26 np0005603500 nova_compute[182934]: 2026-01-31 06:33:26.440 182938 DEBUG nova.network.neutron [req-8358bd1e-4e17-44b4-a98f-a31c18a1b252 req-7b12ea24-fe04-4c4c-9467-9ceb627f0cf3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Refreshing network info cache for port 57786dbf-8d90-4cbe-834d-3cd072a75d1f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:33:26 np0005603500 nova_compute[182934]: 2026-01-31 06:33:26.951 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 7dedc0e6-e769-4fda-b465-152126c73743 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:33:26 np0005603500 nova_compute[182934]: 2026-01-31 06:33:26.952 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:33:26 np0005603500 nova_compute[182934]: 2026-01-31 06:33:26.952 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.004 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.031 182938 DEBUG oslo_concurrency.lockutils [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "7dedc0e6-e769-4fda-b465-152126c73743" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.032 182938 DEBUG oslo_concurrency.lockutils [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.034 182938 DEBUG oslo_concurrency.lockutils [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "7dedc0e6-e769-4fda-b465-152126c73743-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.035 182938 DEBUG oslo_concurrency.lockutils [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.035 182938 DEBUG oslo_concurrency.lockutils [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.036 182938 INFO nova.compute.manager [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Terminating instance
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.184 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.515 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.543 182938 DEBUG nova.compute.manager [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:33:27 np0005603500 kernel: tap57786dbf-8d (unregistering): left promiscuous mode
Jan 31 01:33:27 np0005603500 NetworkManager[55506]: <info>  [1769841207.5742] device (tap57786dbf-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:33:27 np0005603500 ovn_controller[95398]: 2026-01-31T06:33:27Z|00057|binding|INFO|Releasing lport 57786dbf-8d90-4cbe-834d-3cd072a75d1f from this chassis (sb_readonly=0)
Jan 31 01:33:27 np0005603500 ovn_controller[95398]: 2026-01-31T06:33:27Z|00058|binding|INFO|Setting lport 57786dbf-8d90-4cbe-834d-3cd072a75d1f down in Southbound
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.576 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:27 np0005603500 ovn_controller[95398]: 2026-01-31T06:33:27Z|00059|binding|INFO|Removing iface tap57786dbf-8d ovn-installed in OVS
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.583 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.586 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:27.588 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:ad:4e 10.100.0.14'], port_security=['fa:16:3e:dc:ad:4e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dedc0e6-e769-4fda-b465-152126c73743', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'b742a00d-33bc-4f25-9899-8560eae25dc3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06664cb8-3ec3-4b12-9420-23c1bc38e360, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=57786dbf-8d90-4cbe-834d-3cd072a75d1f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:33:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:27.589 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 57786dbf-8d90-4cbe-834d-3cd072a75d1f in datapath 349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63 unbound from our chassis
Jan 31 01:33:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:27.591 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:33:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:27.592 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab2a8e9-1746-4d3c-a89d-3794e96b1188]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:27.593 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63 namespace which is not needed anymore
Jan 31 01:33:27 np0005603500 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 31 01:33:27 np0005603500 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 16.861s CPU time.
Jan 31 01:33:27 np0005603500 systemd-machined[154375]: Machine qemu-1-instance-00000001 terminated.
Jan 31 01:33:27 np0005603500 podman[212730]: 2026-01-31 06:33:27.663249826 +0000 UTC m=+0.060082181 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:33:27 np0005603500 neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63[211915]: [NOTICE]   (211919) : haproxy version is 2.8.14-c23fe91
Jan 31 01:33:27 np0005603500 podman[212772]: 2026-01-31 06:33:27.711732214 +0000 UTC m=+0.028779839 container kill fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 01:33:27 np0005603500 neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63[211915]: [NOTICE]   (211919) : path to executable is /usr/sbin/haproxy
Jan 31 01:33:27 np0005603500 neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63[211915]: [WARNING]  (211919) : Exiting Master process...
Jan 31 01:33:27 np0005603500 neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63[211915]: [ALERT]    (211919) : Current worker (211921) exited with code 143 (Terminated)
Jan 31 01:33:27 np0005603500 neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63[211915]: [WARNING]  (211919) : All workers exited. Exiting... (0)
Jan 31 01:33:27 np0005603500 systemd[1]: libpod-fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7.scope: Deactivated successfully.
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.762 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.765 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.795 182938 INFO nova.virt.libvirt.driver [-] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Instance destroyed successfully.
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.796 182938 DEBUG nova.objects.instance [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 7dedc0e6-e769-4fda-b465-152126c73743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.892 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.965 182938 DEBUG nova.compute.manager [req-1b42a87f-f7b1-4ea8-a5c3-25495c5bdefc req-ed8b4f71-bbf1-4023-a125-2732445d38b8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received event network-vif-unplugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.966 182938 DEBUG oslo_concurrency.lockutils [req-1b42a87f-f7b1-4ea8-a5c3-25495c5bdefc req-ed8b4f71-bbf1-4023-a125-2732445d38b8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "7dedc0e6-e769-4fda-b465-152126c73743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.966 182938 DEBUG oslo_concurrency.lockutils [req-1b42a87f-f7b1-4ea8-a5c3-25495c5bdefc req-ed8b4f71-bbf1-4023-a125-2732445d38b8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.966 182938 DEBUG oslo_concurrency.lockutils [req-1b42a87f-f7b1-4ea8-a5c3-25495c5bdefc req-ed8b4f71-bbf1-4023-a125-2732445d38b8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.966 182938 DEBUG nova.compute.manager [req-1b42a87f-f7b1-4ea8-a5c3-25495c5bdefc req-ed8b4f71-bbf1-4023-a125-2732445d38b8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] No waiting events found dispatching network-vif-unplugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:33:27 np0005603500 nova_compute[182934]: 2026-01-31 06:33:27.966 182938 DEBUG nova.compute.manager [req-1b42a87f-f7b1-4ea8-a5c3-25495c5bdefc req-ed8b4f71-bbf1-4023-a125-2732445d38b8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received event network-vif-unplugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.029 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.030 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:28 np0005603500 podman[212817]: 2026-01-31 06:33:28.172699902 +0000 UTC m=+0.022693446 container died fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true)
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.303 182938 DEBUG nova.virt.libvirt.vif [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:30:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-132386650',display_name='tempest-TestNetworkBasicOps-server-132386650',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-132386650',id=1,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGw41nQMJfT2kxXNRqmgCqTA3vngkKt8B3ulIHkgKcd42+FYSdC0j1jZchA3NNtcC9su1Z4mbyf3ZR6prbQi5Gh07jOCnjQDe+eIAPeL02ydcm3jjG1oX1Ppzv7y0nED0g==',key_name='tempest-TestNetworkBasicOps-78724422',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:31:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-r0f9rvd2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:31:28Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=7dedc0e6-e769-4fda-b465-152126c73743,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.304 182938 DEBUG nova.network.os_vif_util [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.305 182938 DEBUG nova.network.os_vif_util [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dc:ad:4e,bridge_name='br-int',has_traffic_filtering=True,id=57786dbf-8d90-4cbe-834d-3cd072a75d1f,network=Network(349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57786dbf-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.305 182938 DEBUG os_vif [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:ad:4e,bridge_name='br-int',has_traffic_filtering=True,id=57786dbf-8d90-4cbe-834d-3cd072a75d1f,network=Network(349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57786dbf-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.309 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.309 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57786dbf-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.310 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.312 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.313 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.314 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4006599b-ba02-4e30-8a89-0d7fe1197cf3) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.315 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.316 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.317 182938 INFO os_vif [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:ad:4e,bridge_name='br-int',has_traffic_filtering=True,id=57786dbf-8d90-4cbe-834d-3cd072a75d1f,network=Network(349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57786dbf-8d')
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.318 182938 INFO nova.virt.libvirt.driver [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Deleting instance files /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743_del
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.318 182938 INFO nova.virt.libvirt.driver [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Deletion of /var/lib/nova/instances/7dedc0e6-e769-4fda-b465-152126c73743_del complete
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.832 182938 INFO nova.compute.manager [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Took 1.29 seconds to destroy the instance on the hypervisor.
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.832 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.833 182938 DEBUG nova.compute.manager [-] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:33:28 np0005603500 nova_compute[182934]: 2026-01-31 06:33:28.833 182938 DEBUG nova.network.neutron [-] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:33:29 np0005603500 nova_compute[182934]: 2026-01-31 06:33:29.029 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:33:29 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7-userdata-shm.mount: Deactivated successfully.
Jan 31 01:33:29 np0005603500 systemd[1]: var-lib-containers-storage-overlay-e7e3c36394e0facde9cd6b4e232224ee7933535e67e7034a5664ad4d2127baea-merged.mount: Deactivated successfully.
Jan 31 01:33:29 np0005603500 nova_compute[182934]: 2026-01-31 06:33:29.542 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:33:29 np0005603500 nova_compute[182934]: 2026-01-31 06:33:29.543 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:33:29 np0005603500 nova_compute[182934]: 2026-01-31 06:33:29.543 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:33:29 np0005603500 nova_compute[182934]: 2026-01-31 06:33:29.543 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:33:29 np0005603500 nova_compute[182934]: 2026-01-31 06:33:29.544 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:33:29 np0005603500 nova_compute[182934]: 2026-01-31 06:33:29.544 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:33:29 np0005603500 podman[212817]: 2026-01-31 06:33:29.850187172 +0000 UTC m=+1.700180756 container remove fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.856 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8f24d23b-6bcd-4a8c-a2b7-197f9da941c5]: (4, ("Sat Jan 31 06:33:27 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63 (fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7)\nfc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7\nSat Jan 31 06:33:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63 (fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7)\nfc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:29 np0005603500 systemd[1]: libpod-conmon-fc52a7650b4b2873053b5c5a041c20128aff84ca701aab86939f53243ec18ca7.scope: Deactivated successfully.
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.858 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a85449b3-c27e-49f9-b8fb-18ac43cb392c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.859 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.860 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[37bee435-6aab-4509-a350-04aad71f109a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.861 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap349f85d7-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:33:29 np0005603500 nova_compute[182934]: 2026-01-31 06:33:29.863 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:29 np0005603500 kernel: tap349f85d7-90: left promiscuous mode
Jan 31 01:33:29 np0005603500 nova_compute[182934]: 2026-01-31 06:33:29.868 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.872 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[3c400022-d10c-40f9-ab24-fdd8debfae18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.891 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[efcb6b3c-e57e-4fe9-824b-4d6293e890da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.892 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[3045af6a-ea4e-4024-8a4f-e361e81ea272]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.911 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[09670cb5-e8d1-4034-ba43-6094c5b4dcfb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339760, 'reachable_time': 32131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212841, 'error': None, 'target': 'ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:29 np0005603500 systemd[1]: run-netns-ovnmeta\x2d349f85d7\x2d9f4e\x2d4c93\x2d9a9d\x2ddd8f72c1fe63.mount: Deactivated successfully.
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.913 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:33:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:29.913 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[259c4173-7b07-4208-8676-d257776eae79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:29 np0005603500 podman[212833]: 2026-01-31 06:33:29.944436104 +0000 UTC m=+0.057667994 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:33:30 np0005603500 nova_compute[182934]: 2026-01-31 06:33:30.439 182938 DEBUG nova.compute.manager [req-5ed5faad-a762-47ff-89c8-270ce213812b req-b3d2666f-5d1f-4b53-965c-a28a8bf7e199 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received event network-vif-plugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:33:30 np0005603500 nova_compute[182934]: 2026-01-31 06:33:30.439 182938 DEBUG oslo_concurrency.lockutils [req-5ed5faad-a762-47ff-89c8-270ce213812b req-b3d2666f-5d1f-4b53-965c-a28a8bf7e199 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "7dedc0e6-e769-4fda-b465-152126c73743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:30 np0005603500 nova_compute[182934]: 2026-01-31 06:33:30.439 182938 DEBUG oslo_concurrency.lockutils [req-5ed5faad-a762-47ff-89c8-270ce213812b req-b3d2666f-5d1f-4b53-965c-a28a8bf7e199 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:30 np0005603500 nova_compute[182934]: 2026-01-31 06:33:30.440 182938 DEBUG oslo_concurrency.lockutils [req-5ed5faad-a762-47ff-89c8-270ce213812b req-b3d2666f-5d1f-4b53-965c-a28a8bf7e199 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:30 np0005603500 nova_compute[182934]: 2026-01-31 06:33:30.440 182938 DEBUG nova.compute.manager [req-5ed5faad-a762-47ff-89c8-270ce213812b req-b3d2666f-5d1f-4b53-965c-a28a8bf7e199 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] No waiting events found dispatching network-vif-plugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:33:30 np0005603500 nova_compute[182934]: 2026-01-31 06:33:30.440 182938 WARNING nova.compute.manager [req-5ed5faad-a762-47ff-89c8-270ce213812b req-b3d2666f-5d1f-4b53-965c-a28a8bf7e199 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received unexpected event network-vif-plugged-57786dbf-8d90-4cbe-834d-3cd072a75d1f for instance with vm_state active and task_state deleting.
Jan 31 01:33:30 np0005603500 nova_compute[182934]: 2026-01-31 06:33:30.658 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:33:31 np0005603500 nova_compute[182934]: 2026-01-31 06:33:31.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:33:31 np0005603500 nova_compute[182934]: 2026-01-31 06:33:31.248 182938 DEBUG nova.network.neutron [req-8358bd1e-4e17-44b4-a98f-a31c18a1b252 req-7b12ea24-fe04-4c4c-9467-9ceb627f0cf3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Updated VIF entry in instance network info cache for port 57786dbf-8d90-4cbe-834d-3cd072a75d1f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:33:31 np0005603500 nova_compute[182934]: 2026-01-31 06:33:31.249 182938 DEBUG nova.network.neutron [req-8358bd1e-4e17-44b4-a98f-a31c18a1b252 req-7b12ea24-fe04-4c4c-9467-9ceb627f0cf3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Updating instance_info_cache with network_info: [{"id": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "address": "fa:16:3e:dc:ad:4e", "network": {"id": "349f85d7-9f4e-4c93-9a9d-dd8f72c1fe63", "bridge": "br-int", "label": "tempest-network-smoke--1435648712", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57786dbf-8d", "ovs_interfaceid": "57786dbf-8d90-4cbe-834d-3cd072a75d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:33:31 np0005603500 nova_compute[182934]: 2026-01-31 06:33:31.689 182938 DEBUG nova.compute.manager [req-09a4a159-cbeb-4a24-9bc5-e0dcb674af2d req-b0c0c979-8fb6-4105-9118-663eba74ea32 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Received event network-vif-deleted-57786dbf-8d90-4cbe-834d-3cd072a75d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:33:31 np0005603500 nova_compute[182934]: 2026-01-31 06:33:31.690 182938 INFO nova.compute.manager [req-09a4a159-cbeb-4a24-9bc5-e0dcb674af2d req-b0c0c979-8fb6-4105-9118-663eba74ea32 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Neutron deleted interface 57786dbf-8d90-4cbe-834d-3cd072a75d1f; detaching it from the instance and deleting it from the info cache
Jan 31 01:33:31 np0005603500 nova_compute[182934]: 2026-01-31 06:33:31.690 182938 DEBUG nova.network.neutron [req-09a4a159-cbeb-4a24-9bc5-e0dcb674af2d req-b0c0c979-8fb6-4105-9118-663eba74ea32 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:33:31 np0005603500 nova_compute[182934]: 2026-01-31 06:33:31.768 182938 DEBUG oslo_concurrency.lockutils [req-8358bd1e-4e17-44b4-a98f-a31c18a1b252 req-7b12ea24-fe04-4c4c-9467-9ceb627f0cf3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-7dedc0e6-e769-4fda-b465-152126c73743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:33:31 np0005603500 nova_compute[182934]: 2026-01-31 06:33:31.910 182938 DEBUG nova.network.neutron [-] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:33:32 np0005603500 nova_compute[182934]: 2026-01-31 06:33:32.198 182938 DEBUG nova.compute.manager [req-09a4a159-cbeb-4a24-9bc5-e0dcb674af2d req-b0c0c979-8fb6-4105-9118-663eba74ea32 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Detach interface failed, port_id=57786dbf-8d90-4cbe-834d-3cd072a75d1f, reason: Instance 7dedc0e6-e769-4fda-b465-152126c73743 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Jan 31 01:33:32 np0005603500 nova_compute[182934]: 2026-01-31 06:33:32.419 182938 INFO nova.compute.manager [-] [instance: 7dedc0e6-e769-4fda-b465-152126c73743] Took 3.59 seconds to deallocate network for instance.
Jan 31 01:33:32 np0005603500 nova_compute[182934]: 2026-01-31 06:33:32.894 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:32 np0005603500 nova_compute[182934]: 2026-01-31 06:33:32.928 182938 DEBUG oslo_concurrency.lockutils [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:32 np0005603500 nova_compute[182934]: 2026-01-31 06:33:32.928 182938 DEBUG oslo_concurrency.lockutils [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:32 np0005603500 nova_compute[182934]: 2026-01-31 06:33:32.996 182938 DEBUG nova.compute.provider_tree [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:33:33 np0005603500 nova_compute[182934]: 2026-01-31 06:33:33.316 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:33 np0005603500 nova_compute[182934]: 2026-01-31 06:33:33.504 182938 DEBUG nova.scheduler.client.report [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:33:34 np0005603500 nova_compute[182934]: 2026-01-31 06:33:34.013 182938 DEBUG oslo_concurrency.lockutils [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:34 np0005603500 nova_compute[182934]: 2026-01-31 06:33:34.045 182938 INFO nova.scheduler.client.report [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 7dedc0e6-e769-4fda-b465-152126c73743
Jan 31 01:33:35 np0005603500 nova_compute[182934]: 2026-01-31 06:33:35.077 182938 DEBUG oslo_concurrency.lockutils [None req-ce54cf40-9070-49f7-8da6-ca5bced9dde7 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7dedc0e6-e769-4fda-b465-152126c73743" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:37 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:37.168 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:33:37 np0005603500 nova_compute[182934]: 2026-01-31 06:33:37.168 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:37 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:37.170 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:33:37 np0005603500 nova_compute[182934]: 2026-01-31 06:33:37.895 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:38 np0005603500 nova_compute[182934]: 2026-01-31 06:33:38.318 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:42 np0005603500 podman[212863]: 2026-01-31 06:33:42.128091229 +0000 UTC m=+0.047054074 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true)
Jan 31 01:33:42 np0005603500 nova_compute[182934]: 2026-01-31 06:33:42.161 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:42 np0005603500 nova_compute[182934]: 2026-01-31 06:33:42.180 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:42 np0005603500 nova_compute[182934]: 2026-01-31 06:33:42.896 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:43 np0005603500 podman[212883]: 2026-01-31 06:33:43.14261899 +0000 UTC m=+0.059530392 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:33:43 np0005603500 nova_compute[182934]: 2026-01-31 06:33:43.320 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:46 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:46.171 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:33:47 np0005603500 nova_compute[182934]: 2026-01-31 06:33:47.898 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:48 np0005603500 nova_compute[182934]: 2026-01-31 06:33:48.323 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:50 np0005603500 podman[212907]: 2026-01-31 06:33:50.147695335 +0000 UTC m=+0.068799599 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 31 01:33:52 np0005603500 nova_compute[182934]: 2026-01-31 06:33:52.936 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:52 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:52.938 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:7c:da 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d9478fb7-5187-4733-899d-45464c14414d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9478fb7-5187-4733-899d-45464c14414d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9359aad-cbc6-45c4-a734-bba64ba33f13, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=759ae516-9997-4f0f-b500-4c1a16b6262f) old=Port_Binding(mac=['fa:16:3e:f7:7c:da'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d9478fb7-5187-4733-899d-45464c14414d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9478fb7-5187-4733-899d-45464c14414d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:33:52 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:52.939 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 759ae516-9997-4f0f-b500-4c1a16b6262f in datapath d9478fb7-5187-4733-899d-45464c14414d updated
Jan 31 01:33:52 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:52.940 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9478fb7-5187-4733-899d-45464c14414d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:33:52 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:52.941 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[95448769-74da-4eb4-9ca1-59e3c96605ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:33:53 np0005603500 nova_compute[182934]: 2026-01-31 06:33:53.325 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:54 np0005603500 podman[212928]: 2026-01-31 06:33:54.172437225 +0000 UTC m=+0.083669105 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_controller)
Jan 31 01:33:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:55.641 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:33:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:55.642 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:33:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:33:55.642 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:33:58 np0005603500 nova_compute[182934]: 2026-01-31 06:33:58.253 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:58 np0005603500 nova_compute[182934]: 2026-01-31 06:33:58.326 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:33:58 np0005603500 podman[212957]: 2026-01-31 06:33:58.335494174 +0000 UTC m=+0.059853113 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 01:34:00 np0005603500 podman[212978]: 2026-01-31 06:34:00.153195834 +0000 UTC m=+0.075264885 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:34:03 np0005603500 nova_compute[182934]: 2026-01-31 06:34:03.254 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:03 np0005603500 nova_compute[182934]: 2026-01-31 06:34:03.328 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:08 np0005603500 nova_compute[182934]: 2026-01-31 06:34:08.255 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:08 np0005603500 nova_compute[182934]: 2026-01-31 06:34:08.329 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:11 np0005603500 nova_compute[182934]: 2026-01-31 06:34:11.393 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:11 np0005603500 nova_compute[182934]: 2026-01-31 06:34:11.394 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:11 np0005603500 nova_compute[182934]: 2026-01-31 06:34:11.901 182938 DEBUG nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:34:12 np0005603500 nova_compute[182934]: 2026-01-31 06:34:12.441 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:12 np0005603500 nova_compute[182934]: 2026-01-31 06:34:12.441 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:12 np0005603500 nova_compute[182934]: 2026-01-31 06:34:12.449 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:34:12 np0005603500 nova_compute[182934]: 2026-01-31 06:34:12.450 182938 INFO nova.compute.claims [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:34:13 np0005603500 podman[213002]: 2026-01-31 06:34:13.130267738 +0000 UTC m=+0.048102458 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 01:34:13 np0005603500 nova_compute[182934]: 2026-01-31 06:34:13.259 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:13 np0005603500 nova_compute[182934]: 2026-01-31 06:34:13.330 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:13 np0005603500 nova_compute[182934]: 2026-01-31 06:34:13.546 182938 DEBUG nova.compute.provider_tree [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:34:14 np0005603500 nova_compute[182934]: 2026-01-31 06:34:14.067 182938 DEBUG nova.scheduler.client.report [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:34:14 np0005603500 podman[213022]: 2026-01-31 06:34:14.157131673 +0000 UTC m=+0.057480017 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:34:14 np0005603500 nova_compute[182934]: 2026-01-31 06:34:14.584 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:14 np0005603500 nova_compute[182934]: 2026-01-31 06:34:14.584 182938 DEBUG nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:34:15 np0005603500 nova_compute[182934]: 2026-01-31 06:34:15.095 182938 DEBUG nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:34:15 np0005603500 nova_compute[182934]: 2026-01-31 06:34:15.096 182938 DEBUG nova.network.neutron [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:34:15 np0005603500 nova_compute[182934]: 2026-01-31 06:34:15.604 182938 INFO nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:34:15 np0005603500 nova_compute[182934]: 2026-01-31 06:34:15.651 182938 DEBUG nova.policy [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:34:16 np0005603500 nova_compute[182934]: 2026-01-31 06:34:16.115 182938 DEBUG nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.133 182938 DEBUG nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.134 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.135 182938 INFO nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Creating image(s)
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.135 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.136 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.136 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.137 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.141 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.144 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.154 182938 DEBUG nova.network.neutron [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Successfully created port: d66f7017-2344-441d-9926-108c71a6b524 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.187 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.188 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.188 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.189 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.193 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.194 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.244 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.244 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.269 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.270 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.270 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.321 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.321 182938 DEBUG nova.virt.disk.api [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.322 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.367 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.367 182938 DEBUG nova.virt.disk.api [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.368 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.369 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Ensure instance console log exists: /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.370 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.370 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:17 np0005603500 nova_compute[182934]: 2026-01-31 06:34:17.370 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.982 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:34:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:34:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:34:18 np0005603500 nova_compute[182934]: 2026-01-31 06:34:18.260 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:18 np0005603500 nova_compute[182934]: 2026-01-31 06:34:18.376 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:18 np0005603500 nova_compute[182934]: 2026-01-31 06:34:18.669 182938 DEBUG nova.network.neutron [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Successfully updated port: d66f7017-2344-441d-9926-108c71a6b524 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:34:18 np0005603500 nova_compute[182934]: 2026-01-31 06:34:18.967 182938 DEBUG nova.compute.manager [req-f9b3607d-7014-4375-8f78-18bd0278de99 req-6eee249e-ebbf-4055-a8d5-8c608b261a6f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-changed-d66f7017-2344-441d-9926-108c71a6b524 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:34:18 np0005603500 nova_compute[182934]: 2026-01-31 06:34:18.967 182938 DEBUG nova.compute.manager [req-f9b3607d-7014-4375-8f78-18bd0278de99 req-6eee249e-ebbf-4055-a8d5-8c608b261a6f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Refreshing instance network info cache due to event network-changed-d66f7017-2344-441d-9926-108c71a6b524. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:34:18 np0005603500 nova_compute[182934]: 2026-01-31 06:34:18.968 182938 DEBUG oslo_concurrency.lockutils [req-f9b3607d-7014-4375-8f78-18bd0278de99 req-6eee249e-ebbf-4055-a8d5-8c608b261a6f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:34:18 np0005603500 nova_compute[182934]: 2026-01-31 06:34:18.968 182938 DEBUG oslo_concurrency.lockutils [req-f9b3607d-7014-4375-8f78-18bd0278de99 req-6eee249e-ebbf-4055-a8d5-8c608b261a6f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:34:18 np0005603500 nova_compute[182934]: 2026-01-31 06:34:18.968 182938 DEBUG nova.network.neutron [req-f9b3607d-7014-4375-8f78-18bd0278de99 req-6eee249e-ebbf-4055-a8d5-8c608b261a6f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Refreshing network info cache for port d66f7017-2344-441d-9926-108c71a6b524 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:34:19 np0005603500 nova_compute[182934]: 2026-01-31 06:34:19.176 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:34:20 np0005603500 nova_compute[182934]: 2026-01-31 06:34:20.650 182938 DEBUG nova.network.neutron [req-f9b3607d-7014-4375-8f78-18bd0278de99 req-6eee249e-ebbf-4055-a8d5-8c608b261a6f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:34:21 np0005603500 podman[213061]: 2026-01-31 06:34:21.119536569 +0000 UTC m=+0.044923081 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, release=1769056855, version=9.7, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter)
Jan 31 01:34:21 np0005603500 nova_compute[182934]: 2026-01-31 06:34:21.686 182938 DEBUG nova.network.neutron [req-f9b3607d-7014-4375-8f78-18bd0278de99 req-6eee249e-ebbf-4055-a8d5-8c608b261a6f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:34:23 np0005603500 nova_compute[182934]: 2026-01-31 06:34:23.261 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:23 np0005603500 nova_compute[182934]: 2026-01-31 06:34:23.378 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:23 np0005603500 nova_compute[182934]: 2026-01-31 06:34:23.705 182938 DEBUG oslo_concurrency.lockutils [req-f9b3607d-7014-4375-8f78-18bd0278de99 req-6eee249e-ebbf-4055-a8d5-8c608b261a6f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:34:23 np0005603500 nova_compute[182934]: 2026-01-31 06:34:23.706 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:34:23 np0005603500 nova_compute[182934]: 2026-01-31 06:34:23.706 182938 DEBUG nova.network.neutron [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:34:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:24Z|00060|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 31 01:34:24 np0005603500 nova_compute[182934]: 2026-01-31 06:34:24.753 182938 DEBUG nova.network.neutron [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:34:25 np0005603500 podman[213084]: 2026-01-31 06:34:25.140856406 +0000 UTC m=+0.065983561 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 01:34:25 np0005603500 nova_compute[182934]: 2026-01-31 06:34:25.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:34:25 np0005603500 nova_compute[182934]: 2026-01-31 06:34:25.660 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:25 np0005603500 nova_compute[182934]: 2026-01-31 06:34:25.660 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:25 np0005603500 nova_compute[182934]: 2026-01-31 06:34:25.661 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:25 np0005603500 nova_compute[182934]: 2026-01-31 06:34:25.661 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:34:25 np0005603500 nova_compute[182934]: 2026-01-31 06:34:25.782 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:34:25 np0005603500 nova_compute[182934]: 2026-01-31 06:34:25.783 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5789MB free_disk=73.21562957763672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:34:25 np0005603500 nova_compute[182934]: 2026-01-31 06:34:25.783 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:25 np0005603500 nova_compute[182934]: 2026-01-31 06:34:25.784 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:26 np0005603500 nova_compute[182934]: 2026-01-31 06:34:26.875 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance a94f97e8-6060-473c-92bc-75030c79b628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:34:26 np0005603500 nova_compute[182934]: 2026-01-31 06:34:26.875 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:34:26 np0005603500 nova_compute[182934]: 2026-01-31 06:34:26.875 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:34:26 np0005603500 nova_compute[182934]: 2026-01-31 06:34:26.914 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:34:27 np0005603500 nova_compute[182934]: 2026-01-31 06:34:27.429 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:34:27 np0005603500 nova_compute[182934]: 2026-01-31 06:34:27.940 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:34:27 np0005603500 nova_compute[182934]: 2026-01-31 06:34:27.941 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:28 np0005603500 nova_compute[182934]: 2026-01-31 06:34:28.262 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:28 np0005603500 nova_compute[182934]: 2026-01-31 06:34:28.380 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:28 np0005603500 nova_compute[182934]: 2026-01-31 06:34:28.643 182938 DEBUG nova.network.neutron [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updating instance_info_cache with network_info: [{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:34:28 np0005603500 nova_compute[182934]: 2026-01-31 06:34:28.940 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:34:28 np0005603500 nova_compute[182934]: 2026-01-31 06:34:28.941 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:34:28 np0005603500 nova_compute[182934]: 2026-01-31 06:34:28.941 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:34:28 np0005603500 nova_compute[182934]: 2026-01-31 06:34:28.941 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:34:29 np0005603500 podman[213111]: 2026-01-31 06:34:29.128968452 +0000 UTC m=+0.051256182 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.204 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.205 182938 DEBUG nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Instance network_info: |[{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.207 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Start _get_guest_xml network_info=[{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.210 182938 WARNING nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.211 182938 DEBUG nova.virt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1745004988', uuid='a94f97e8-6060-473c-92bc-75030c79b628'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_devic
e_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841269.2110007) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.216 182938 DEBUG nova.virt.libvirt.host [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.217 182938 DEBUG nova.virt.libvirt.host [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.220 182938 DEBUG nova.virt.libvirt.host [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.221 182938 DEBUG nova.virt.libvirt.host [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.221 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.221 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.222 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.222 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.222 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.223 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.223 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.223 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.223 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.223 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.224 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.224 182938 DEBUG nova.virt.hardware [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.228 182938 DEBUG nova.virt.libvirt.vif [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1745004988',display_name='tempest-TestNetworkBasicOps-server-1745004988',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1745004988',id=3,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMI2O3WgANh2Ex39GfZpZtfGNn6Lqkeu95Vh4npQIMUyjsxGzg+vjYJnFgs6NBrTXoqH34OoaB7KMRljyOaVLfMxIzoP5XGG7G//VOeFqutEe9mXsK6+VunzjSElPnmqPg==',key_name='tempest-TestNetworkBasicOps-578042830',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0ulgrsf7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:34:16Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=a94f97e8-6060-473c-92bc-75030c79b628,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.228 182938 DEBUG nova.network.os_vif_util [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.229 182938 DEBUG nova.network.os_vif_util [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:78:c9,bridge_name='br-int',has_traffic_filtering=True,id=d66f7017-2344-441d-9926-108c71a6b524,network=Network(d9478fb7-5187-4733-899d-45464c14414d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66f7017-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.230 182938 DEBUG nova.objects.instance [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid a94f97e8-6060-473c-92bc-75030c79b628 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.739 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <uuid>a94f97e8-6060-473c-92bc-75030c79b628</uuid>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <name>instance-00000003</name>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-1745004988</nova:name>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:34:29</nova:creationTime>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:        <nova:port uuid="d66f7017-2344-441d-9926-108c71a6b524">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <entry name="serial">a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <entry name="uuid">a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.config"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:a3:78:c9"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <target dev="tapd66f7017-23"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log" append="off"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:34:29 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:34:29 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:34:29 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:34:29 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.740 182938 DEBUG nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Preparing to wait for external event network-vif-plugged-d66f7017-2344-441d-9926-108c71a6b524 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.741 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.741 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.741 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.742 182938 DEBUG nova.virt.libvirt.vif [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1745004988',display_name='tempest-TestNetworkBasicOps-server-1745004988',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1745004988',id=3,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMI2O3WgANh2Ex39GfZpZtfGNn6Lqkeu95Vh4npQIMUyjsxGzg+vjYJnFgs6NBrTXoqH34OoaB7KMRljyOaVLfMxIzoP5XGG7G//VOeFqutEe9mXsK6+VunzjSElPnmqPg==',key_name='tempest-TestNetworkBasicOps-578042830',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0ulgrsf7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:34:16Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=a94f97e8-6060-473c-92bc-75030c79b628,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.742 182938 DEBUG nova.network.os_vif_util [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.743 182938 DEBUG nova.network.os_vif_util [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:78:c9,bridge_name='br-int',has_traffic_filtering=True,id=d66f7017-2344-441d-9926-108c71a6b524,network=Network(d9478fb7-5187-4733-899d-45464c14414d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66f7017-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.744 182938 DEBUG os_vif [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:78:c9,bridge_name='br-int',has_traffic_filtering=True,id=d66f7017-2344-441d-9926-108c71a6b524,network=Network(d9478fb7-5187-4733-899d-45464c14414d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66f7017-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.744 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.745 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.745 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.746 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.746 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a2ab0164-59a9-599d-ba65-6d4d8baf2f7e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.748 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.750 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.750 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd66f7017-23, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.750 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd66f7017-23, col_values=(('qos', UUID('9fd9bdb5-9f6b-490f-bb60-50946c9c306a')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.751 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd66f7017-23, col_values=(('external_ids', {'iface-id': 'd66f7017-2344-441d-9926-108c71a6b524', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:78:c9', 'vm-uuid': 'a94f97e8-6060-473c-92bc-75030c79b628'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:34:29 np0005603500 NetworkManager[55506]: <info>  [1769841269.7530] manager: (tapd66f7017-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.754 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.757 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:29 np0005603500 nova_compute[182934]: 2026-01-31 06:34:29.758 182938 INFO os_vif [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:78:c9,bridge_name='br-int',has_traffic_filtering=True,id=d66f7017-2344-441d-9926-108c71a6b524,network=Network(d9478fb7-5187-4733-899d-45464c14414d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66f7017-23')
Jan 31 01:34:31 np0005603500 podman[213133]: 2026-01-31 06:34:31.116202619 +0000 UTC m=+0.039311852 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:34:31 np0005603500 nova_compute[182934]: 2026-01-31 06:34:31.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:34:31 np0005603500 nova_compute[182934]: 2026-01-31 06:34:31.295 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:34:31 np0005603500 nova_compute[182934]: 2026-01-31 06:34:31.295 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:34:31 np0005603500 nova_compute[182934]: 2026-01-31 06:34:31.295 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:a3:78:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:34:31 np0005603500 nova_compute[182934]: 2026-01-31 06:34:31.296 182938 INFO nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Using config drive
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.090 182938 INFO nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Creating config drive at /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.config
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.094 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp7agf6u66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.208 182938 DEBUG oslo_concurrency.processutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp7agf6u66" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:34:33 np0005603500 kernel: tapd66f7017-23: entered promiscuous mode
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.257 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:33 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:33Z|00061|binding|INFO|Claiming lport d66f7017-2344-441d-9926-108c71a6b524 for this chassis.
Jan 31 01:34:33 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:33Z|00062|binding|INFO|d66f7017-2344-441d-9926-108c71a6b524: Claiming fa:16:3e:a3:78:c9 10.100.0.7
Jan 31 01:34:33 np0005603500 NetworkManager[55506]: <info>  [1769841273.2599] manager: (tapd66f7017-23): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.262 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.265 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.273 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:78:c9 10.100.0.7'], port_security=['fa:16:3e:a3:78:c9 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a94f97e8-6060-473c-92bc-75030c79b628', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9478fb7-5187-4733-899d-45464c14414d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '19762263-34af-4bf7-9f2c-13801e872303', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9359aad-cbc6-45c4-a734-bba64ba33f13, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=d66f7017-2344-441d-9926-108c71a6b524) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.274 104644 INFO neutron.agent.ovn.metadata.agent [-] Port d66f7017-2344-441d-9926-108c71a6b524 in datapath d9478fb7-5187-4733-899d-45464c14414d bound to our chassis
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.276 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9478fb7-5187-4733-899d-45464c14414d
Jan 31 01:34:33 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:33Z|00063|binding|INFO|Setting lport d66f7017-2344-441d-9926-108c71a6b524 ovn-installed in OVS
Jan 31 01:34:33 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:33Z|00064|binding|INFO|Setting lport d66f7017-2344-441d-9926-108c71a6b524 up in Southbound
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.281 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.285 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a94e5afc-0f0d-4e09-8bb1-2b0bd84a89be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.286 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd9478fb7-51 in ovnmeta-d9478fb7-5187-4733-899d-45464c14414d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:34:33 np0005603500 systemd-udevd[213177]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.288 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd9478fb7-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.288 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[da51e159-0d40-498b-9ee0-764bc70f9e7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.289 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ecaa8c25-2d9f-4b7b-b09c-2327d8b4227f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 systemd-machined[154375]: New machine qemu-3-instance-00000003.
Jan 31 01:34:33 np0005603500 NetworkManager[55506]: <info>  [1769841273.2988] device (tapd66f7017-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:34:33 np0005603500 NetworkManager[55506]: <info>  [1769841273.2993] device (tapd66f7017-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.299 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec88271-d057-4a60-9837-93ad135fe886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.312 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[3f441e2a-f454-4f5e-ab83-687cd2b3d5f6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.335 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb8d868-faee-497a-a95a-b809b029fd30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.340 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f06eb5e0-69a0-4cf6-bba7-36567fbace53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 NetworkManager[55506]: <info>  [1769841273.3411] manager: (tapd9478fb7-50): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.367 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[e0fedbfe-8292-4852-abde-760236255071]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.370 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[c11a070d-20bd-4fcd-95c5-0fdb38794341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 NetworkManager[55506]: <info>  [1769841273.3901] device (tapd9478fb7-50): carrier: link connected
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.394 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[8454b456-dc8b-43c3-8bf6-9b6bb694ac66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.409 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[55e6d536-6efb-4c28-b9be-c23669107145]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9478fb7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:7c:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358468, 'reachable_time': 16380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213210, 'error': None, 'target': 'ovnmeta-d9478fb7-5187-4733-899d-45464c14414d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.423 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[2956a39c-236e-4e67-ab05-e1eb90535d0b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:7cda'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358468, 'tstamp': 358468}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213211, 'error': None, 'target': 'ovnmeta-d9478fb7-5187-4733-899d-45464c14414d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.435 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[cb0ffa1d-4660-4ade-b181-d9a3db04e8c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9478fb7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:7c:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358468, 'reachable_time': 16380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213212, 'error': None, 'target': 'ovnmeta-d9478fb7-5187-4733-899d-45464c14414d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.457 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e6dcfc-e7ca-4ec2-9dcf-0fed85d5934c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.497 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d539b0-71d9-4253-afc6-9adb40b0af71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.498 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9478fb7-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.498 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.498 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9478fb7-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:34:33 np0005603500 kernel: tapd9478fb7-50: entered promiscuous mode
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.500 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:33 np0005603500 NetworkManager[55506]: <info>  [1769841273.5008] manager: (tapd9478fb7-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.502 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9478fb7-50, col_values=(('external_ids', {'iface-id': '759ae516-9997-4f0f-b500-4c1a16b6262f'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.503 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:33 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:33Z|00065|binding|INFO|Releasing lport 759ae516-9997-4f0f-b500-4c1a16b6262f from this chassis (sb_readonly=0)
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.504 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[be0e0d61-3875-460c-a54f-c8881f0755e8]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.505 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.505 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.505 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d9478fb7-5187-4733-899d-45464c14414d disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.505 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.505 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5268b0-5de3-485f-9102-88b9fc43c6e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.506 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.506 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[861c6711-4956-488b-a54e-87bde29926b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.506 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-d9478fb7-5187-4733-899d-45464c14414d
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID d9478fb7-5187-4733-899d-45464c14414d
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:34:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:33.506 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d9478fb7-5187-4733-899d-45464c14414d', 'env', 'PROCESS_TAG=haproxy-d9478fb7-5187-4733-899d-45464c14414d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d9478fb7-5187-4733-899d-45464c14414d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.508 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:33 np0005603500 podman[213244]: 2026-01-31 06:34:33.803664898 +0000 UTC m=+0.022389133 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.915 182938 DEBUG nova.compute.manager [req-f7b56dec-e634-49cc-abab-4283e242cf7c req-7f774e2a-08f6-4569-bde4-d04a3d1ac7b0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-plugged-d66f7017-2344-441d-9926-108c71a6b524 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.916 182938 DEBUG oslo_concurrency.lockutils [req-f7b56dec-e634-49cc-abab-4283e242cf7c req-7f774e2a-08f6-4569-bde4-d04a3d1ac7b0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.917 182938 DEBUG oslo_concurrency.lockutils [req-f7b56dec-e634-49cc-abab-4283e242cf7c req-7f774e2a-08f6-4569-bde4-d04a3d1ac7b0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.917 182938 DEBUG oslo_concurrency.lockutils [req-f7b56dec-e634-49cc-abab-4283e242cf7c req-7f774e2a-08f6-4569-bde4-d04a3d1ac7b0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:33 np0005603500 nova_compute[182934]: 2026-01-31 06:34:33.917 182938 DEBUG nova.compute.manager [req-f7b56dec-e634-49cc-abab-4283e242cf7c req-7f774e2a-08f6-4569-bde4-d04a3d1ac7b0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Processing event network-vif-plugged-d66f7017-2344-441d-9926-108c71a6b524 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.257 182938 DEBUG nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.262 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.268 182938 INFO nova.virt.libvirt.driver [-] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Instance spawned successfully.
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.268 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:34:34 np0005603500 podman[213244]: 2026-01-31 06:34:34.445689119 +0000 UTC m=+0.664413334 container create 68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 01:34:34 np0005603500 systemd[1]: Started libpod-conmon-68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f.scope.
Jan 31 01:34:34 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:34:34 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43d8a40676991a3232a4371b6dfea0e89a6bd5df7c6a8e57ce9669c72791d5d3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:34:34 np0005603500 podman[213244]: 2026-01-31 06:34:34.64645941 +0000 UTC m=+0.865183645 container init 68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:34:34 np0005603500 podman[213244]: 2026-01-31 06:34:34.650703976 +0000 UTC m=+0.869428191 container start 68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 01:34:34 np0005603500 neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d[213266]: [NOTICE]   (213270) : New worker (213272) forked
Jan 31 01:34:34 np0005603500 neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d[213266]: [NOTICE]   (213270) : Loading success.
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.752 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.784 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.784 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.785 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.785 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.785 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:34:34 np0005603500 nova_compute[182934]: 2026-01-31 06:34:34.786 182938 DEBUG nova.virt.libvirt.driver [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:34:35 np0005603500 nova_compute[182934]: 2026-01-31 06:34:35.295 182938 INFO nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Took 18.16 seconds to spawn the instance on the hypervisor.
Jan 31 01:34:35 np0005603500 nova_compute[182934]: 2026-01-31 06:34:35.296 182938 DEBUG nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:34:35 np0005603500 nova_compute[182934]: 2026-01-31 06:34:35.866 182938 INFO nova.compute.manager [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Took 23.46 seconds to build instance.
Jan 31 01:34:36 np0005603500 nova_compute[182934]: 2026-01-31 06:34:36.297 182938 DEBUG nova.compute.manager [req-bb48d006-fcd3-43a5-bec9-349fcd9fd660 req-cd7e5b57-ced9-4ba8-bb32-fcf942e741f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-plugged-d66f7017-2344-441d-9926-108c71a6b524 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:34:36 np0005603500 nova_compute[182934]: 2026-01-31 06:34:36.297 182938 DEBUG oslo_concurrency.lockutils [req-bb48d006-fcd3-43a5-bec9-349fcd9fd660 req-cd7e5b57-ced9-4ba8-bb32-fcf942e741f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:36 np0005603500 nova_compute[182934]: 2026-01-31 06:34:36.298 182938 DEBUG oslo_concurrency.lockutils [req-bb48d006-fcd3-43a5-bec9-349fcd9fd660 req-cd7e5b57-ced9-4ba8-bb32-fcf942e741f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:36 np0005603500 nova_compute[182934]: 2026-01-31 06:34:36.298 182938 DEBUG oslo_concurrency.lockutils [req-bb48d006-fcd3-43a5-bec9-349fcd9fd660 req-cd7e5b57-ced9-4ba8-bb32-fcf942e741f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:36 np0005603500 nova_compute[182934]: 2026-01-31 06:34:36.298 182938 DEBUG nova.compute.manager [req-bb48d006-fcd3-43a5-bec9-349fcd9fd660 req-cd7e5b57-ced9-4ba8-bb32-fcf942e741f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] No waiting events found dispatching network-vif-plugged-d66f7017-2344-441d-9926-108c71a6b524 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:34:36 np0005603500 nova_compute[182934]: 2026-01-31 06:34:36.299 182938 WARNING nova.compute.manager [req-bb48d006-fcd3-43a5-bec9-349fcd9fd660 req-cd7e5b57-ced9-4ba8-bb32-fcf942e741f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received unexpected event network-vif-plugged-d66f7017-2344-441d-9926-108c71a6b524 for instance with vm_state active and task_state None.
Jan 31 01:34:36 np0005603500 nova_compute[182934]: 2026-01-31 06:34:36.372 182938 DEBUG oslo_concurrency.lockutils [None req-5706726a-8e17-4473-83f9-1fe670c52e39 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.978s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:38 np0005603500 nova_compute[182934]: 2026-01-31 06:34:38.266 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:39 np0005603500 nova_compute[182934]: 2026-01-31 06:34:39.755 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:43 np0005603500 nova_compute[182934]: 2026-01-31 06:34:43.267 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:44 np0005603500 podman[213281]: 2026-01-31 06:34:44.136293065 +0000 UTC m=+0.049534089 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 31 01:34:44 np0005603500 nova_compute[182934]: 2026-01-31 06:34:44.757 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:44Z|00066|binding|INFO|Releasing lport 759ae516-9997-4f0f-b500-4c1a16b6262f from this chassis (sb_readonly=0)
Jan 31 01:34:44 np0005603500 nova_compute[182934]: 2026-01-31 06:34:44.772 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:44 np0005603500 NetworkManager[55506]: <info>  [1769841284.7768] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 31 01:34:44 np0005603500 NetworkManager[55506]: <info>  [1769841284.7774] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 31 01:34:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:44Z|00067|binding|INFO|Releasing lport 759ae516-9997-4f0f-b500-4c1a16b6262f from this chassis (sb_readonly=0)
Jan 31 01:34:44 np0005603500 nova_compute[182934]: 2026-01-31 06:34:44.782 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:44 np0005603500 nova_compute[182934]: 2026-01-31 06:34:44.787 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:45 np0005603500 podman[213298]: 2026-01-31 06:34:45.130262089 +0000 UTC m=+0.049861129 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:34:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:45.705 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:34:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:45.707 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:34:45 np0005603500 nova_compute[182934]: 2026-01-31 06:34:45.748 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:45 np0005603500 nova_compute[182934]: 2026-01-31 06:34:45.992 182938 DEBUG nova.compute.manager [req-3d478b7b-329f-433e-ae64-8e0b736d17b8 req-2b609b0e-7467-4ece-a322-847e66c454b6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-changed-d66f7017-2344-441d-9926-108c71a6b524 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:34:45 np0005603500 nova_compute[182934]: 2026-01-31 06:34:45.993 182938 DEBUG nova.compute.manager [req-3d478b7b-329f-433e-ae64-8e0b736d17b8 req-2b609b0e-7467-4ece-a322-847e66c454b6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Refreshing instance network info cache due to event network-changed-d66f7017-2344-441d-9926-108c71a6b524. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:34:45 np0005603500 nova_compute[182934]: 2026-01-31 06:34:45.993 182938 DEBUG oslo_concurrency.lockutils [req-3d478b7b-329f-433e-ae64-8e0b736d17b8 req-2b609b0e-7467-4ece-a322-847e66c454b6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:34:45 np0005603500 nova_compute[182934]: 2026-01-31 06:34:45.993 182938 DEBUG oslo_concurrency.lockutils [req-3d478b7b-329f-433e-ae64-8e0b736d17b8 req-2b609b0e-7467-4ece-a322-847e66c454b6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:34:45 np0005603500 nova_compute[182934]: 2026-01-31 06:34:45.993 182938 DEBUG nova.network.neutron [req-3d478b7b-329f-433e-ae64-8e0b736d17b8 req-2b609b0e-7467-4ece-a322-847e66c454b6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Refreshing network info cache for port d66f7017-2344-441d-9926-108c71a6b524 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:34:47 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:47Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a3:78:c9 10.100.0.7
Jan 31 01:34:47 np0005603500 ovn_controller[95398]: 2026-01-31T06:34:47Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:78:c9 10.100.0.7
Jan 31 01:34:48 np0005603500 nova_compute[182934]: 2026-01-31 06:34:48.270 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:49 np0005603500 nova_compute[182934]: 2026-01-31 06:34:49.759 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:50.708 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:34:50 np0005603500 nova_compute[182934]: 2026-01-31 06:34:50.983 182938 DEBUG nova.network.neutron [req-3d478b7b-329f-433e-ae64-8e0b736d17b8 req-2b609b0e-7467-4ece-a322-847e66c454b6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updated VIF entry in instance network info cache for port d66f7017-2344-441d-9926-108c71a6b524. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:34:50 np0005603500 nova_compute[182934]: 2026-01-31 06:34:50.984 182938 DEBUG nova.network.neutron [req-3d478b7b-329f-433e-ae64-8e0b736d17b8 req-2b609b0e-7467-4ece-a322-847e66c454b6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updating instance_info_cache with network_info: [{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:34:51 np0005603500 nova_compute[182934]: 2026-01-31 06:34:51.506 182938 DEBUG oslo_concurrency.lockutils [req-3d478b7b-329f-433e-ae64-8e0b736d17b8 req-2b609b0e-7467-4ece-a322-847e66c454b6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:34:52 np0005603500 podman[213341]: 2026-01-31 06:34:52.138988153 +0000 UTC m=+0.060666412 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 31 01:34:52 np0005603500 nova_compute[182934]: 2026-01-31 06:34:52.416 182938 INFO nova.compute.manager [None req-1fe6fc25-128f-45b6-9411-07d17223441e dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Get console output
Jan 31 01:34:52 np0005603500 nova_compute[182934]: 2026-01-31 06:34:52.422 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:34:53 np0005603500 nova_compute[182934]: 2026-01-31 06:34:53.272 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:54 np0005603500 nova_compute[182934]: 2026-01-31 06:34:54.760 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:55.692 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:55.693 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:55.693 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:34:55 np0005603500 podman[213363]: 2026-01-31 06:34:55.828142044 +0000 UTC m=+0.100782319 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 
Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 01:34:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:57.014 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:7a:39 10.100.0.17'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.17/28', 'neutron:device_id': 'ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03b073a9-8472-442a-9685-a489d08f03bf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f77d185d-08a3-47c0-aec4-ea60ea0b309c) old=Port_Binding(mac=['fa:16:3e:14:7a:39'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:34:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:57.015 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f77d185d-08a3-47c0-aec4-ea60ea0b309c in datapath e729c920-47a0-485f-a4f2-ea061c5a8a32 updated
Jan 31 01:34:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:57.016 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e729c920-47a0-485f-a4f2-ea061c5a8a32, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:34:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:34:57.017 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[491eb988-1d08-4ada-8bd8-4619b74baaa0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:34:58 np0005603500 nova_compute[182934]: 2026-01-31 06:34:58.274 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:34:58 np0005603500 nova_compute[182934]: 2026-01-31 06:34:58.589 182938 DEBUG oslo_concurrency.lockutils [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "interface-a94f97e8-6060-473c-92bc-75030c79b628-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:34:58 np0005603500 nova_compute[182934]: 2026-01-31 06:34:58.590 182938 DEBUG oslo_concurrency.lockutils [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "interface-a94f97e8-6060-473c-92bc-75030c79b628-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:34:58 np0005603500 nova_compute[182934]: 2026-01-31 06:34:58.590 182938 DEBUG nova.objects.instance [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'flavor' on Instance uuid a94f97e8-6060-473c-92bc-75030c79b628 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:34:59 np0005603500 nova_compute[182934]: 2026-01-31 06:34:59.762 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:00 np0005603500 nova_compute[182934]: 2026-01-31 06:35:00.065 182938 DEBUG nova.objects.instance [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_requests' on Instance uuid a94f97e8-6060-473c-92bc-75030c79b628 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:35:00 np0005603500 podman[213389]: 2026-01-31 06:35:00.141680174 +0000 UTC m=+0.058657539 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ceilometer_agent_compute)
Jan 31 01:35:00 np0005603500 nova_compute[182934]: 2026-01-31 06:35:00.575 182938 DEBUG nova.objects.base [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Object Instance<a94f97e8-6060-473c-92bc-75030c79b628> lazy-loaded attributes: flavor,pci_requests wrapper /usr/lib/python3.9/site-packages/nova/objects/base.py:136
Jan 31 01:35:00 np0005603500 nova_compute[182934]: 2026-01-31 06:35:00.576 182938 DEBUG nova.network.neutron [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:35:01 np0005603500 nova_compute[182934]: 2026-01-31 06:35:01.204 182938 DEBUG nova.policy [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:35:02 np0005603500 podman[213410]: 2026-01-31 06:35:02.148568807 +0000 UTC m=+0.069544105 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 31 01:35:02 np0005603500 nova_compute[182934]: 2026-01-31 06:35:02.769 182938 DEBUG nova.network.neutron [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Successfully created port: 94ef64eb-5138-4961-ad73-1296ae99b4f1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:35:03 np0005603500 nova_compute[182934]: 2026-01-31 06:35:03.277 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:04 np0005603500 nova_compute[182934]: 2026-01-31 06:35:04.111 182938 DEBUG nova.network.neutron [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Successfully updated port: 94ef64eb-5138-4961-ad73-1296ae99b4f1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:35:04 np0005603500 nova_compute[182934]: 2026-01-31 06:35:04.428 182938 DEBUG nova.compute.manager [req-a31eedc0-7166-40d3-8ab0-4089876dbb3d req-10d8ddfd-1f8d-4ba0-bcf3-ba1b3ba88541 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-changed-94ef64eb-5138-4961-ad73-1296ae99b4f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:04 np0005603500 nova_compute[182934]: 2026-01-31 06:35:04.428 182938 DEBUG nova.compute.manager [req-a31eedc0-7166-40d3-8ab0-4089876dbb3d req-10d8ddfd-1f8d-4ba0-bcf3-ba1b3ba88541 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Refreshing instance network info cache due to event network-changed-94ef64eb-5138-4961-ad73-1296ae99b4f1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:35:04 np0005603500 nova_compute[182934]: 2026-01-31 06:35:04.428 182938 DEBUG oslo_concurrency.lockutils [req-a31eedc0-7166-40d3-8ab0-4089876dbb3d req-10d8ddfd-1f8d-4ba0-bcf3-ba1b3ba88541 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:35:04 np0005603500 nova_compute[182934]: 2026-01-31 06:35:04.429 182938 DEBUG oslo_concurrency.lockutils [req-a31eedc0-7166-40d3-8ab0-4089876dbb3d req-10d8ddfd-1f8d-4ba0-bcf3-ba1b3ba88541 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:35:04 np0005603500 nova_compute[182934]: 2026-01-31 06:35:04.429 182938 DEBUG nova.network.neutron [req-a31eedc0-7166-40d3-8ab0-4089876dbb3d req-10d8ddfd-1f8d-4ba0-bcf3-ba1b3ba88541 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Refreshing network info cache for port 94ef64eb-5138-4961-ad73-1296ae99b4f1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:35:04 np0005603500 nova_compute[182934]: 2026-01-31 06:35:04.623 182938 DEBUG oslo_concurrency.lockutils [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:35:04 np0005603500 nova_compute[182934]: 2026-01-31 06:35:04.765 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:08 np0005603500 nova_compute[182934]: 2026-01-31 06:35:08.280 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:09 np0005603500 nova_compute[182934]: 2026-01-31 06:35:09.767 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:10 np0005603500 nova_compute[182934]: 2026-01-31 06:35:10.013 182938 DEBUG nova.network.neutron [req-a31eedc0-7166-40d3-8ab0-4089876dbb3d req-10d8ddfd-1f8d-4ba0-bcf3-ba1b3ba88541 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Added VIF to instance network info cache for port 94ef64eb-5138-4961-ad73-1296ae99b4f1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3546
Jan 31 01:35:10 np0005603500 nova_compute[182934]: 2026-01-31 06:35:10.014 182938 DEBUG nova.network.neutron [req-a31eedc0-7166-40d3-8ab0-4089876dbb3d req-10d8ddfd-1f8d-4ba0-bcf3-ba1b3ba88541 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updating instance_info_cache with network_info: [{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:35:10 np0005603500 nova_compute[182934]: 2026-01-31 06:35:10.521 182938 DEBUG oslo_concurrency.lockutils [req-a31eedc0-7166-40d3-8ab0-4089876dbb3d req-10d8ddfd-1f8d-4ba0-bcf3-ba1b3ba88541 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:35:10 np0005603500 nova_compute[182934]: 2026-01-31 06:35:10.521 182938 DEBUG oslo_concurrency.lockutils [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:35:10 np0005603500 nova_compute[182934]: 2026-01-31 06:35:10.521 182938 DEBUG nova.network.neutron [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:35:11 np0005603500 nova_compute[182934]: 2026-01-31 06:35:11.653 182938 WARNING nova.network.neutron [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] e729c920-47a0-485f-a4f2-ea061c5a8a32 already exists in list: networks containing: ['e729c920-47a0-485f-a4f2-ea061c5a8a32']. ignoring it
Jan 31 01:35:11 np0005603500 nova_compute[182934]: 2026-01-31 06:35:11.654 182938 WARNING nova.network.neutron [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] 94ef64eb-5138-4961-ad73-1296ae99b4f1 already exists in list: port_ids containing: ['94ef64eb-5138-4961-ad73-1296ae99b4f1']. ignoring it
Jan 31 01:35:13 np0005603500 nova_compute[182934]: 2026-01-31 06:35:13.282 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:14 np0005603500 nova_compute[182934]: 2026-01-31 06:35:14.770 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:15 np0005603500 podman[213435]: 2026-01-31 06:35:15.14443513 +0000 UTC m=+0.067381126 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 31 01:35:15 np0005603500 podman[213454]: 2026-01-31 06:35:15.217263099 +0000 UTC m=+0.043818876 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.354 182938 DEBUG nova.network.neutron [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updating instance_info_cache with network_info: [{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.861 182938 DEBUG oslo_concurrency.lockutils [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.864 182938 DEBUG nova.virt.libvirt.vif [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1745004988',display_name='tempest-TestNetworkBasicOps-server-1745004988',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1745004988',id=3,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMI2O3WgANh2Ex39GfZpZtfGNn6Lqkeu95Vh4npQIMUyjsxGzg+vjYJnFgs6NBrTXoqH34OoaB7KMRljyOaVLfMxIzoP5XGG7G//VOeFqutEe9mXsK6+VunzjSElPnmqPg==',key_name='tempest-TestNetworkBasicOps-578042830',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:34:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0ulgrsf7',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:34:35Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=a94f97e8-6060-473c-92bc-75030c79b628,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.865 182938 DEBUG nova.network.os_vif_util [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.865 182938 DEBUG nova.network.os_vif_util [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.866 182938 DEBUG os_vif [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.866 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.866 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.867 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.867 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.867 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '10b7fba9-add6-5ea3-8e64-7584c81f67e1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.868 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.870 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.872 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.872 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94ef64eb-51, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.872 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap94ef64eb-51, col_values=(('qos', UUID('c72d40ce-455f-4c58-a65d-174be8d761c4')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.872 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap94ef64eb-51, col_values=(('external_ids', {'iface-id': '94ef64eb-5138-4961-ad73-1296ae99b4f1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0b:28:e3', 'vm-uuid': 'a94f97e8-6060-473c-92bc-75030c79b628'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.873 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:17 np0005603500 NetworkManager[55506]: <info>  [1769841317.8744] manager: (tap94ef64eb-51): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.875 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.881 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.881 182938 INFO os_vif [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51')
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.882 182938 DEBUG nova.virt.libvirt.vif [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1745004988',display_name='tempest-TestNetworkBasicOps-server-1745004988',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1745004988',id=3,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMI2O3WgANh2Ex39GfZpZtfGNn6Lqkeu95Vh4npQIMUyjsxGzg+vjYJnFgs6NBrTXoqH34OoaB7KMRljyOaVLfMxIzoP5XGG7G//VOeFqutEe9mXsK6+VunzjSElPnmqPg==',key_name='tempest-TestNetworkBasicOps-578042830',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:34:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0ulgrsf7',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:34:35Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=a94f97e8-6060-473c-92bc-75030c79b628,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.882 182938 DEBUG nova.network.os_vif_util [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.882 182938 DEBUG nova.network.os_vif_util [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.885 182938 DEBUG nova.virt.libvirt.guest [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] attach device xml: <interface type="ethernet">
Jan 31 01:35:17 np0005603500 nova_compute[182934]:  <mac address="fa:16:3e:0b:28:e3"/>
Jan 31 01:35:17 np0005603500 nova_compute[182934]:  <model type="virtio"/>
Jan 31 01:35:17 np0005603500 nova_compute[182934]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:35:17 np0005603500 nova_compute[182934]:  <mtu size="1442"/>
Jan 31 01:35:17 np0005603500 nova_compute[182934]:  <target dev="tap94ef64eb-51"/>
Jan 31 01:35:17 np0005603500 nova_compute[182934]: </interface>
Jan 31 01:35:17 np0005603500 nova_compute[182934]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:336
Jan 31 01:35:17 np0005603500 kernel: tap94ef64eb-51: entered promiscuous mode
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.897 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:17 np0005603500 NetworkManager[55506]: <info>  [1769841317.8971] manager: (tap94ef64eb-51): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 31 01:35:17 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:17Z|00068|binding|INFO|Claiming lport 94ef64eb-5138-4961-ad73-1296ae99b4f1 for this chassis.
Jan 31 01:35:17 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:17Z|00069|binding|INFO|94ef64eb-5138-4961-ad73-1296ae99b4f1: Claiming fa:16:3e:0b:28:e3 10.100.0.30
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.909 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:28:e3 10.100.0.30'], port_security=['fa:16:3e:0b:28:e3 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': 'a94f97e8-6060-473c-92bc-75030c79b628', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '859450e8-8706-4a8f-9018-d6c18f1c21cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03b073a9-8472-442a-9685-a489d08f03bf, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=94ef64eb-5138-4961-ad73-1296ae99b4f1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.910 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 94ef64eb-5138-4961-ad73-1296ae99b4f1 in datapath e729c920-47a0-485f-a4f2-ea061c5a8a32 bound to our chassis
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.912 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e729c920-47a0-485f-a4f2-ea061c5a8a32
Jan 31 01:35:17 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:17Z|00070|binding|INFO|Setting lport 94ef64eb-5138-4961-ad73-1296ae99b4f1 ovn-installed in OVS
Jan 31 01:35:17 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:17Z|00071|binding|INFO|Setting lport 94ef64eb-5138-4961-ad73-1296ae99b4f1 up in Southbound
Jan 31 01:35:17 np0005603500 systemd-udevd[213484]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:35:17 np0005603500 NetworkManager[55506]: <info>  [1769841317.9391] device (tap94ef64eb-51): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:35:17 np0005603500 NetworkManager[55506]: <info>  [1769841317.9400] device (tap94ef64eb-51): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:35:17 np0005603500 nova_compute[182934]: 2026-01-31 06:35:17.948 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.958 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[192ac2e2-6ff5-48a0-ba41-e7e3ec699246]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.959 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape729c920-41 in ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.962 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape729c920-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.963 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[33d8dc93-be8c-47dc-a59d-fb6126ab9788]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.963 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2f31f8-398a-44a2-8b28-4e66f545f621]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.971 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[90a62ccd-eaa5-4b7e-9077-99db61ff42e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:17.982 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[10fb0d39-bf7d-4eaf-be28-e00406680633]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.005 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[60f1450d-d316-493f-90d2-a1d1adf9f7df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 NetworkManager[55506]: <info>  [1769841318.0124] manager: (tape729c920-40): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.013 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8fac0f6f-176e-45a7-a8bb-92bc161a461f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 systemd-udevd[213486]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.034 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[1645a729-e28e-44ac-b369-7d4bbccc39fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.037 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[a652172a-e9a4-4bdf-8f44-c38fb6e91ddf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 NetworkManager[55506]: <info>  [1769841318.0540] device (tape729c920-40): carrier: link connected
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.058 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[9f728863-4a54-4a55-bc6e-b491531fda45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.070 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[991bc8e5-401e-4fd3-b183-cd0f4b7ea795]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape729c920-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:7a:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362935, 'reachable_time': 34911, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213514, 'error': None, 'target': 'ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.082 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[7164b3dc-2b2c-4298-a1aa-1182c73c912f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe14:7a39'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362935, 'tstamp': 362935}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213515, 'error': None, 'target': 'ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.097 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[29089f87-69bf-4837-9b8f-c1cfe5a12b3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape729c920-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:7a:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362935, 'reachable_time': 34911, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213516, 'error': None, 'target': 'ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.120 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[34e88ae3-2288-40c2-aacd-0b61a2dd28dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.166 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[33276786-a995-4339-8d21-ddbc83c0e661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.167 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape729c920-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.167 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.168 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape729c920-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.169 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:18 np0005603500 kernel: tape729c920-40: entered promiscuous mode
Jan 31 01:35:18 np0005603500 NetworkManager[55506]: <info>  [1769841318.1706] manager: (tape729c920-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.171 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.172 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape729c920-40, col_values=(('external_ids', {'iface-id': 'f77d185d-08a3-47c0-aec4-ea60ea0b309c'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.173 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:18 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:18Z|00072|binding|INFO|Releasing lport f77d185d-08a3-47c0-aec4-ea60ea0b309c from this chassis (sb_readonly=0)
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.174 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.175 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d56c1ec2-404d-4b08-a5b2-7922bb8fc038]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.176 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.176 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.176 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for e729c920-47a0-485f-a4f2-ea061c5a8a32 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.177 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.177 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.177 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[82e2c321-0678-4a26-a317-b2dd7ee90103]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.178 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.178 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[46c1ad06-34cc-40df-b1f3-e106f9fde465]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.179 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-e729c920-47a0-485f-a4f2-ea061c5a8a32
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID e729c920-47a0-485f-a4f2-ea061c5a8a32
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:35:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:18.180 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'env', 'PROCESS_TAG=haproxy-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e729c920-47a0-485f-a4f2-ea061c5a8a32.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.262 182938 DEBUG nova.compute.manager [req-e274d305-eb01-4cb0-9847-d53891cdf244 req-653755bd-91ff-4917-aff4-4192fb65f896 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-plugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.262 182938 DEBUG oslo_concurrency.lockutils [req-e274d305-eb01-4cb0-9847-d53891cdf244 req-653755bd-91ff-4917-aff4-4192fb65f896 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.263 182938 DEBUG oslo_concurrency.lockutils [req-e274d305-eb01-4cb0-9847-d53891cdf244 req-653755bd-91ff-4917-aff4-4192fb65f896 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.263 182938 DEBUG oslo_concurrency.lockutils [req-e274d305-eb01-4cb0-9847-d53891cdf244 req-653755bd-91ff-4917-aff4-4192fb65f896 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.263 182938 DEBUG nova.compute.manager [req-e274d305-eb01-4cb0-9847-d53891cdf244 req-653755bd-91ff-4917-aff4-4192fb65f896 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] No waiting events found dispatching network-vif-plugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.263 182938 WARNING nova.compute.manager [req-e274d305-eb01-4cb0-9847-d53891cdf244 req-653755bd-91ff-4917-aff4-4192fb65f896 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received unexpected event network-vif-plugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 for instance with vm_state active and task_state None.
Jan 31 01:35:18 np0005603500 nova_compute[182934]: 2026-01-31 06:35:18.283 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:18 np0005603500 podman[213548]: 2026-01-31 06:35:18.535532192 +0000 UTC m=+0.070992891 container create f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 01:35:18 np0005603500 systemd[1]: Started libpod-conmon-f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e.scope.
Jan 31 01:35:18 np0005603500 podman[213548]: 2026-01-31 06:35:18.483742763 +0000 UTC m=+0.019203492 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:35:18 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:35:18 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4aedcbfd3862c81d1aaa6139a48f17f3b4e8d87ef5c06b0c0e4b019097f1ff77/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:35:18 np0005603500 podman[213548]: 2026-01-31 06:35:18.619425602 +0000 UTC m=+0.154886322 container init f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 01:35:18 np0005603500 podman[213548]: 2026-01-31 06:35:18.624962839 +0000 UTC m=+0.160423548 container start f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 01:35:18 np0005603500 neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32[213563]: [NOTICE]   (213567) : New worker (213569) forked
Jan 31 01:35:18 np0005603500 neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32[213563]: [NOTICE]   (213567) : Loading success.
Jan 31 01:35:19 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:19Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0b:28:e3 10.100.0.30
Jan 31 01:35:19 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:19Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0b:28:e3 10.100.0.30
Jan 31 01:35:19 np0005603500 nova_compute[182934]: 2026-01-31 06:35:19.542 182938 DEBUG nova.virt.libvirt.driver [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:35:19 np0005603500 nova_compute[182934]: 2026-01-31 06:35:19.542 182938 DEBUG nova.virt.libvirt.driver [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:35:19 np0005603500 nova_compute[182934]: 2026-01-31 06:35:19.542 182938 DEBUG nova.virt.libvirt.driver [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:a3:78:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:35:19 np0005603500 nova_compute[182934]: 2026-01-31 06:35:19.542 182938 DEBUG nova.virt.libvirt.driver [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:0b:28:e3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:35:20 np0005603500 nova_compute[182934]: 2026-01-31 06:35:20.056 182938 DEBUG nova.virt.driver [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1745004988', uuid='a94f97e8-6060-473c-92bc-75030c79b628'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841320.0562575) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:35:20 np0005603500 nova_compute[182934]: 2026-01-31 06:35:20.057 182938 DEBUG nova.virt.libvirt.guest [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-1745004988</nova:name>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:35:20</nova:creationTime>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    <nova:port uuid="d66f7017-2344-441d-9926-108c71a6b524">
Jan 31 01:35:20 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    <nova:port uuid="94ef64eb-5138-4961-ad73-1296ae99b4f1">
Jan 31 01:35:20 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.30" ipVersion="4"/>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:20 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:35:20 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:35:20 np0005603500 nova_compute[182934]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Jan 31 01:35:20 np0005603500 nova_compute[182934]: 2026-01-31 06:35:20.463 182938 DEBUG nova.compute.manager [req-75d4676f-032f-4df6-bc60-f680fb331cfd req-6b47b018-c83d-451a-ac56-e3a488a3b62f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-plugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:20 np0005603500 nova_compute[182934]: 2026-01-31 06:35:20.463 182938 DEBUG oslo_concurrency.lockutils [req-75d4676f-032f-4df6-bc60-f680fb331cfd req-6b47b018-c83d-451a-ac56-e3a488a3b62f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:20 np0005603500 nova_compute[182934]: 2026-01-31 06:35:20.464 182938 DEBUG oslo_concurrency.lockutils [req-75d4676f-032f-4df6-bc60-f680fb331cfd req-6b47b018-c83d-451a-ac56-e3a488a3b62f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:20 np0005603500 nova_compute[182934]: 2026-01-31 06:35:20.464 182938 DEBUG oslo_concurrency.lockutils [req-75d4676f-032f-4df6-bc60-f680fb331cfd req-6b47b018-c83d-451a-ac56-e3a488a3b62f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:20 np0005603500 nova_compute[182934]: 2026-01-31 06:35:20.464 182938 DEBUG nova.compute.manager [req-75d4676f-032f-4df6-bc60-f680fb331cfd req-6b47b018-c83d-451a-ac56-e3a488a3b62f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] No waiting events found dispatching network-vif-plugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:35:20 np0005603500 nova_compute[182934]: 2026-01-31 06:35:20.464 182938 WARNING nova.compute.manager [req-75d4676f-032f-4df6-bc60-f680fb331cfd req-6b47b018-c83d-451a-ac56-e3a488a3b62f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received unexpected event network-vif-plugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 for instance with vm_state active and task_state None.
Jan 31 01:35:20 np0005603500 nova_compute[182934]: 2026-01-31 06:35:20.572 182938 DEBUG oslo_concurrency.lockutils [None req-3ff880dd-3553-4c73-82bd-ab47808fa8f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "interface-a94f97e8-6060-473c-92bc-75030c79b628-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 21.983s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:22 np0005603500 nova_compute[182934]: 2026-01-31 06:35:22.874 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:23 np0005603500 podman[213578]: 2026-01-31 06:35:23.140858779 +0000 UTC m=+0.058963588 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9/ubi-minimal, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container)
Jan 31 01:35:23 np0005603500 nova_compute[182934]: 2026-01-31 06:35:23.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:23 np0005603500 nova_compute[182934]: 2026-01-31 06:35:23.285 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:23 np0005603500 nova_compute[182934]: 2026-01-31 06:35:23.515 182938 DEBUG oslo_concurrency.lockutils [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "interface-a94f97e8-6060-473c-92bc-75030c79b628-94ef64eb-5138-4961-ad73-1296ae99b4f1" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:23 np0005603500 nova_compute[182934]: 2026-01-31 06:35:23.516 182938 DEBUG oslo_concurrency.lockutils [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "interface-a94f97e8-6060-473c-92bc-75030c79b628-94ef64eb-5138-4961-ad73-1296ae99b4f1" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.025 182938 DEBUG nova.objects.instance [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'flavor' on Instance uuid a94f97e8-6060-473c-92bc-75030c79b628 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.533 182938 DEBUG nova.virt.libvirt.vif [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1745004988',display_name='tempest-TestNetworkBasicOps-server-1745004988',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1745004988',id=3,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMI2O3WgANh2Ex39GfZpZtfGNn6Lqkeu95Vh4npQIMUyjsxGzg+vjYJnFgs6NBrTXoqH34OoaB7KMRljyOaVLfMxIzoP5XGG7G//VOeFqutEe9mXsK6+VunzjSElPnmqPg==',key_name='tempest-TestNetworkBasicOps-578042830',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:34:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0ulgrsf7',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:34:35Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=a94f97e8-6060-473c-92bc-75030c79b628,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.534 182938 DEBUG nova.network.os_vif_util [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.535 182938 DEBUG nova.network.os_vif_util [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.538 182938 DEBUG nova.virt.libvirt.guest [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.540 182938 DEBUG nova.virt.libvirt.guest [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.542 182938 DEBUG nova.virt.libvirt.driver [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Attempting to detach device tap94ef64eb-51 from instance a94f97e8-6060-473c-92bc-75030c79b628 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2637
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.542 182938 DEBUG nova.virt.libvirt.guest [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] detach device xml: <interface type="ethernet">
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <mac address="fa:16:3e:0b:28:e3"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <model type="virtio"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <mtu size="1442"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <target dev="tap94ef64eb-51"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]: </interface>
Jan 31 01:35:24 np0005603500 nova_compute[182934]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:466
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.581 182938 DEBUG nova.virt.libvirt.guest [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.585 182938 DEBUG nova.virt.libvirt.guest [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <name>instance-00000003</name>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <uuid>a94f97e8-6060-473c-92bc-75030c79b628</uuid>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-1745004988</nova:name>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:35:20</nova:creationTime>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:port uuid="d66f7017-2344-441d-9926-108c71a6b524">
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <nova:port uuid="94ef64eb-5138-4961-ad73-1296ae99b4f1">
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.30" ipVersion="4"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:35:24 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <memory unit='KiB'>131072</memory>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <vcpu placement='static'>1</vcpu>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <resource>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <partition>/machine</partition>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </resource>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <sysinfo type='smbios'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <entry name='manufacturer'>RDO</entry>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <entry name='serial'>a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <entry name='uuid'>a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <entry name='family'>Virtual Machine</entry>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <boot dev='hd'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <smbios mode='sysinfo'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <vmcoreinfo state='on'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <vendor>AMD</vendor>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='x2apic'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc-deadline'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='hypervisor'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc_adjust'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='spec-ctrl'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='stibp'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='ssbd'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='cmp_legacy'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='overflow-recov'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='succor'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='ibrs'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='amd-ssbd'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='virt-ssbd'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='lbrv'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='tsc-scale'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='vmcb-clean'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='flushbyasid'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pause-filter'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pfthreshold'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='xsaves'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svm'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='require' name='topoext'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='npt'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <feature policy='disable' name='nrip-save'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <clock offset='utc'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <timer name='hpet' present='no'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <on_poweroff>destroy</on_poweroff>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <on_reboot>restart</on_reboot>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <on_crash>destroy</on_crash>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <disk type='file' device='disk'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk' index='2'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <backingStore type='file' index='3'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:        <format type='raw'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:        <source file='/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:        <backingStore/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      </backingStore>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target dev='vda' bus='virtio'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='virtio-disk0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <disk type='file' device='cdrom'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.config' index='1'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <backingStore/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target dev='sda' bus='sata'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <readonly/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='sata0-0-0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pcie.0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='1' port='0x10'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.1'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='2' port='0x11'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.2'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='3' port='0x12'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.3'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='4' port='0x13'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.4'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='5' port='0x14'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.5'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='6' port='0x15'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.6'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='7' port='0x16'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.7'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='8' port='0x17'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.8'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='9' port='0x18'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.9'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='10' port='0x19'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.10'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='11' port='0x1a'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.11'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='12' port='0x1b'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.12'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='13' port='0x1c'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.13'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='14' port='0x1d'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.14'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='15' port='0x1e'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.15'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='16' port='0x1f'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.16'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='17' port='0x20'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.17'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='18' port='0x21'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.18'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='19' port='0x22'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.19'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='20' port='0x23'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.20'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='21' port='0x24'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.21'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='22' port='0x25'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.22'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='23' port='0x26'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.23'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='24' port='0x27'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.24'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target chassis='25' port='0x28'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.25'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model name='pcie-pci-bridge'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='pci.26'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='usb'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <controller type='sata' index='0'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='ide'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:a3:78:c9'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target dev='tapd66f7017-23'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='net0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:0b:28:e3'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target dev='tap94ef64eb-51'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='net1'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <serial type='pty'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log' append='off'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target type='isa-serial' port='0'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:        <model name='isa-serial'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      </target>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log' append='off'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <target type='serial' port='0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <input type='tablet' bus='usb'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='input0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='usb' bus='0' port='1'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <input type='mouse' bus='ps2'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='input1'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <input type='keyboard' bus='ps2'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='input2'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <listen type='address' address='::0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <audio id='1' type='none'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='video0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <watchdog model='itco' action='reset'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='watchdog0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </watchdog>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <memballoon model='virtio'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <stats period='10'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='balloon0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <rng model='virtio'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <backend model='random'>/dev/urandom</backend>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <alias name='rng0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <label>system_u:system_r:svirt_t:s0:c395,c640</label>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c395,c640</imagelabel>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <label>+107:+107</label>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:    <imagelabel>+107:+107</imagelabel>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:35:24 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:35:24 np0005603500 nova_compute[182934]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.586 182938 INFO nova.virt.libvirt.driver [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully detached device tap94ef64eb-51 from instance a94f97e8-6060-473c-92bc-75030c79b628 from the persistent domain config.
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.587 182938 DEBUG nova.virt.libvirt.driver [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] (1/8): Attempting to detach device tap94ef64eb-51 with device alias net1 from instance a94f97e8-6060-473c-92bc-75030c79b628 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2673
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.587 182938 DEBUG nova.virt.libvirt.guest [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] detach device xml: <interface type="ethernet">
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <mac address="fa:16:3e:0b:28:e3"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <model type="virtio"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <mtu size="1442"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]:  <target dev="tap94ef64eb-51"/>
Jan 31 01:35:24 np0005603500 nova_compute[182934]: </interface>
Jan 31 01:35:24 np0005603500 nova_compute[182934]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:466
Jan 31 01:35:24 np0005603500 kernel: tap94ef64eb-51 (unregistering): left promiscuous mode
Jan 31 01:35:24 np0005603500 NetworkManager[55506]: <info>  [1769841324.6349] device (tap94ef64eb-51): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:35:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:24Z|00073|binding|INFO|Releasing lport 94ef64eb-5138-4961-ad73-1296ae99b4f1 from this chassis (sb_readonly=0)
Jan 31 01:35:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:24Z|00074|binding|INFO|Setting lport 94ef64eb-5138-4961-ad73-1296ae99b4f1 down in Southbound
Jan 31 01:35:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:24Z|00075|binding|INFO|Removing iface tap94ef64eb-51 ovn-installed in OVS
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.639 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.646 182938 DEBUG nova.virt.libvirt.driver [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Start waiting for the detach event from libvirt for device tap94ef64eb-51 with device alias net1 for instance a94f97e8-6060-473c-92bc-75030c79b628 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2749
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.647 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.686 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:28:e3 10.100.0.30'], port_security=['fa:16:3e:0b:28:e3 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': 'a94f97e8-6060-473c-92bc-75030c79b628', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '859450e8-8706-4a8f-9018-d6c18f1c21cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03b073a9-8472-442a-9685-a489d08f03bf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=94ef64eb-5138-4961-ad73-1296ae99b4f1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.687 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 94ef64eb-5138-4961-ad73-1296ae99b4f1 in datapath e729c920-47a0-485f-a4f2-ea061c5a8a32 unbound from our chassis
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.688 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e729c920-47a0-485f-a4f2-ea061c5a8a32, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.689 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[3bae605a-2bc0-43b7-9bf0-32489f2dc48e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.689 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32 namespace which is not needed anymore
Jan 31 01:35:24 np0005603500 neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32[213563]: [NOTICE]   (213567) : haproxy version is 2.8.14-c23fe91
Jan 31 01:35:24 np0005603500 neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32[213563]: [NOTICE]   (213567) : path to executable is /usr/sbin/haproxy
Jan 31 01:35:24 np0005603500 neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32[213563]: [WARNING]  (213567) : Exiting Master process...
Jan 31 01:35:24 np0005603500 podman[213624]: 2026-01-31 06:35:24.795350453 +0000 UTC m=+0.029795430 container kill f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127)
Jan 31 01:35:24 np0005603500 neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32[213563]: [ALERT]    (213567) : Current worker (213569) exited with code 143 (Terminated)
Jan 31 01:35:24 np0005603500 neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32[213563]: [WARNING]  (213567) : All workers exited. Exiting... (0)
Jan 31 01:35:24 np0005603500 systemd[1]: libpod-f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e.scope: Deactivated successfully.
Jan 31 01:35:24 np0005603500 podman[213637]: 2026-01-31 06:35:24.838938161 +0000 UTC m=+0.025289156 container died f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:35:24 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e-userdata-shm.mount: Deactivated successfully.
Jan 31 01:35:24 np0005603500 systemd[1]: var-lib-containers-storage-overlay-4aedcbfd3862c81d1aaa6139a48f17f3b4e8d87ef5c06b0c0e4b019097f1ff77-merged.mount: Deactivated successfully.
Jan 31 01:35:24 np0005603500 podman[213637]: 2026-01-31 06:35:24.891916907 +0000 UTC m=+0.078267872 container cleanup f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:35:24 np0005603500 systemd[1]: libpod-conmon-f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e.scope: Deactivated successfully.
Jan 31 01:35:24 np0005603500 podman[213649]: 2026-01-31 06:35:24.982224913 +0000 UTC m=+0.149297615 container remove f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.986 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4508acc5-3f1d-4d02-991d-cfe7e9d6108b]: (4, ("Sat Jan 31 06:35:24 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32 (f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e)\nf3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e\nSat Jan 31 06:35:24 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32 (f3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e)\nf3f85e98204668f3b6c1ba589c6727c1ad19e66be515590b04ae310085f67e3e\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.988 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[16dfd982-0550-4be6-897b-44f967b1f836]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.988 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e729c920-47a0-485f-a4f2-ea061c5a8a32.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.989 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a28f6190-5ec5-4ca7-aad9-4e82758544d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:24.990 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape729c920-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.992 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:24 np0005603500 kernel: tape729c920-40: left promiscuous mode
Jan 31 01:35:24 np0005603500 nova_compute[182934]: 2026-01-31 06:35:24.999 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:25.003 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a51112-b050-46f2-987d-63df1948076e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:25.026 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6204bc-f7fb-4b24-bc7f-13e85d0e1995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:25.027 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4bb7c488-962f-4e3b-90be-cb6630ae6a58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:25.042 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9f2d36-4dd3-467d-8eb3-e2d779d97953]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362929, 'reachable_time': 36343, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213669, 'error': None, 'target': 'ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:25.045 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e729c920-47a0-485f-a4f2-ea061c5a8a32 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:35:25 np0005603500 systemd[1]: run-netns-ovnmeta\x2de729c920\x2d47a0\x2d485f\x2da4f2\x2dea061c5a8a32.mount: Deactivated successfully.
Jan 31 01:35:25 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:25.045 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[4f81a7c7-5af3-4f32-b9d9-1a0dccfcc02d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:26 np0005603500 podman[213670]: 2026-01-31 06:35:26.165442752 +0000 UTC m=+0.082088185 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:35:26 np0005603500 nova_compute[182934]: 2026-01-31 06:35:26.662 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:26 np0005603500 nova_compute[182934]: 2026-01-31 06:35:26.937 182938 DEBUG nova.compute.manager [req-f54a5892-f657-402f-b81b-3b733b6c62f9 req-3dae71dd-5653-454a-84e4-8091523812e8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-unplugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:26 np0005603500 nova_compute[182934]: 2026-01-31 06:35:26.938 182938 DEBUG oslo_concurrency.lockutils [req-f54a5892-f657-402f-b81b-3b733b6c62f9 req-3dae71dd-5653-454a-84e4-8091523812e8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:26 np0005603500 nova_compute[182934]: 2026-01-31 06:35:26.938 182938 DEBUG oslo_concurrency.lockutils [req-f54a5892-f657-402f-b81b-3b733b6c62f9 req-3dae71dd-5653-454a-84e4-8091523812e8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:26 np0005603500 nova_compute[182934]: 2026-01-31 06:35:26.938 182938 DEBUG oslo_concurrency.lockutils [req-f54a5892-f657-402f-b81b-3b733b6c62f9 req-3dae71dd-5653-454a-84e4-8091523812e8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:26 np0005603500 nova_compute[182934]: 2026-01-31 06:35:26.939 182938 DEBUG nova.compute.manager [req-f54a5892-f657-402f-b81b-3b733b6c62f9 req-3dae71dd-5653-454a-84e4-8091523812e8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] No waiting events found dispatching network-vif-unplugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:35:26 np0005603500 nova_compute[182934]: 2026-01-31 06:35:26.939 182938 WARNING nova.compute.manager [req-f54a5892-f657-402f-b81b-3b733b6c62f9 req-3dae71dd-5653-454a-84e4-8091523812e8 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received unexpected event network-vif-unplugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 for instance with vm_state active and task_state None.
Jan 31 01:35:27 np0005603500 nova_compute[182934]: 2026-01-31 06:35:27.181 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:27 np0005603500 nova_compute[182934]: 2026-01-31 06:35:27.182 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:27 np0005603500 nova_compute[182934]: 2026-01-31 06:35:27.182 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:27 np0005603500 nova_compute[182934]: 2026-01-31 06:35:27.183 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:35:27 np0005603500 nova_compute[182934]: 2026-01-31 06:35:27.877 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:28 np0005603500 nova_compute[182934]: 2026-01-31 06:35:28.228 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:35:28 np0005603500 nova_compute[182934]: 2026-01-31 06:35:28.275 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:35:28 np0005603500 nova_compute[182934]: 2026-01-31 06:35:28.276 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:35:28 np0005603500 nova_compute[182934]: 2026-01-31 06:35:28.289 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:28 np0005603500 nova_compute[182934]: 2026-01-31 06:35:28.330 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:35:28 np0005603500 nova_compute[182934]: 2026-01-31 06:35:28.499 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:35:28 np0005603500 nova_compute[182934]: 2026-01-31 06:35:28.501 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5574MB free_disk=73.18705368041992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:35:28 np0005603500 nova_compute[182934]: 2026-01-31 06:35:28.501 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:28 np0005603500 nova_compute[182934]: 2026-01-31 06:35:28.501 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.170 182938 DEBUG nova.compute.manager [req-e6dccdee-7edb-4e4d-acfa-59295c88fbb2 req-5a580b75-86a1-49f6-8b09-71cf7d12a8ef 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-plugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.171 182938 DEBUG oslo_concurrency.lockutils [req-e6dccdee-7edb-4e4d-acfa-59295c88fbb2 req-5a580b75-86a1-49f6-8b09-71cf7d12a8ef 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.171 182938 DEBUG oslo_concurrency.lockutils [req-e6dccdee-7edb-4e4d-acfa-59295c88fbb2 req-5a580b75-86a1-49f6-8b09-71cf7d12a8ef 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.171 182938 DEBUG oslo_concurrency.lockutils [req-e6dccdee-7edb-4e4d-acfa-59295c88fbb2 req-5a580b75-86a1-49f6-8b09-71cf7d12a8ef 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.171 182938 DEBUG nova.compute.manager [req-e6dccdee-7edb-4e4d-acfa-59295c88fbb2 req-5a580b75-86a1-49f6-8b09-71cf7d12a8ef 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] No waiting events found dispatching network-vif-plugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.172 182938 WARNING nova.compute.manager [req-e6dccdee-7edb-4e4d-acfa-59295c88fbb2 req-5a580b75-86a1-49f6-8b09-71cf7d12a8ef 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received unexpected event network-vif-plugged-94ef64eb-5138-4961-ad73-1296ae99b4f1 for instance with vm_state active and task_state None.
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.646 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance a94f97e8-6060-473c-92bc-75030c79b628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.647 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.647 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:35:29 np0005603500 nova_compute[182934]: 2026-01-31 06:35:29.721 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:35:30 np0005603500 nova_compute[182934]: 2026-01-31 06:35:30.230 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:35:30 np0005603500 nova_compute[182934]: 2026-01-31 06:35:30.741 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:35:30 np0005603500 nova_compute[182934]: 2026-01-31 06:35:30.742 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:31 np0005603500 podman[213703]: 2026-01-31 06:35:31.173084959 +0000 UTC m=+0.085257935 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 31 01:35:32 np0005603500 nova_compute[182934]: 2026-01-31 06:35:32.227 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:32 np0005603500 nova_compute[182934]: 2026-01-31 06:35:32.228 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:32 np0005603500 nova_compute[182934]: 2026-01-31 06:35:32.228 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:32 np0005603500 nova_compute[182934]: 2026-01-31 06:35:32.228 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:32 np0005603500 nova_compute[182934]: 2026-01-31 06:35:32.229 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:32 np0005603500 nova_compute[182934]: 2026-01-31 06:35:32.229 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:32 np0005603500 nova_compute[182934]: 2026-01-31 06:35:32.229 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:35:32 np0005603500 nova_compute[182934]: 2026-01-31 06:35:32.880 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:33 np0005603500 podman[213723]: 2026-01-31 06:35:33.131061784 +0000 UTC m=+0.050775797 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:35:33 np0005603500 nova_compute[182934]: 2026-01-31 06:35:33.144 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:33 np0005603500 nova_compute[182934]: 2026-01-31 06:35:33.290 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:34 np0005603500 nova_compute[182934]: 2026-01-31 06:35:34.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:34 np0005603500 nova_compute[182934]: 2026-01-31 06:35:34.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Jan 31 01:35:34 np0005603500 nova_compute[182934]: 2026-01-31 06:35:34.663 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Jan 31 01:35:35 np0005603500 nova_compute[182934]: 2026-01-31 06:35:35.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:35 np0005603500 nova_compute[182934]: 2026-01-31 06:35:35.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:35:35 np0005603500 nova_compute[182934]: 2026-01-31 06:35:35.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Jan 31 01:35:37 np0005603500 nova_compute[182934]: 2026-01-31 06:35:37.884 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:38 np0005603500 nova_compute[182934]: 2026-01-31 06:35:38.292 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:42 np0005603500 nova_compute[182934]: 2026-01-31 06:35:42.887 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:43 np0005603500 nova_compute[182934]: 2026-01-31 06:35:43.318 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.649 182938 WARNING nova.virt.libvirt.driver [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for libvirt event about the detach of device tap94ef64eb-51 with device alias net1 from instance a94f97e8-6060-473c-92bc-75030c79b628 is timed out.
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.650 182938 DEBUG nova.virt.libvirt.guest [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.657 182938 DEBUG nova.virt.libvirt.guest [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <name>instance-00000003</name>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <uuid>a94f97e8-6060-473c-92bc-75030c79b628</uuid>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-1745004988</nova:name>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:35:20</nova:creationTime>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:port uuid="d66f7017-2344-441d-9926-108c71a6b524">
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:port uuid="94ef64eb-5138-4961-ad73-1296ae99b4f1">
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.30" ipVersion="4"/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:35:44 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <memory unit='KiB'>131072</memory>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <vcpu placement='static'>1</vcpu>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <resource>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <partition>/machine</partition>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </resource>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <sysinfo type='smbios'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <entry name='manufacturer'>RDO</entry>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <entry name='serial'>a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <entry name='uuid'>a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <entry name='family'>Virtual Machine</entry>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <boot dev='hd'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <smbios mode='sysinfo'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <vmcoreinfo state='on'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <vendor>AMD</vendor>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='x2apic'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc-deadline'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='hypervisor'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc_adjust'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='spec-ctrl'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='stibp'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='ssbd'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='cmp_legacy'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='overflow-recov'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='succor'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='ibrs'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='amd-ssbd'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='virt-ssbd'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='lbrv'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='tsc-scale'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='vmcb-clean'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='flushbyasid'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pause-filter'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pfthreshold'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='xsaves'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svm'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='require' name='topoext'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='npt'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <feature policy='disable' name='nrip-save'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <clock offset='utc'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <timer name='hpet' present='no'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <on_poweroff>destroy</on_poweroff>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <on_reboot>restart</on_reboot>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <on_crash>destroy</on_crash>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <disk type='file' device='disk'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk' index='2'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <backingStore type='file' index='3'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:        <format type='raw'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:        <source file='/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:        <backingStore/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      </backingStore>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target dev='vda' bus='virtio'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='virtio-disk0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <disk type='file' device='cdrom'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.config' index='1'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <backingStore/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target dev='sda' bus='sata'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <readonly/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='sata0-0-0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pcie.0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='1' port='0x10'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.1'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='2' port='0x11'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.2'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='3' port='0x12'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.3'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='4' port='0x13'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.4'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='5' port='0x14'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.5'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='6' port='0x15'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.6'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='7' port='0x16'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.7'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='8' port='0x17'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.8'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='9' port='0x18'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.9'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='10' port='0x19'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.10'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='11' port='0x1a'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.11'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='12' port='0x1b'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.12'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='13' port='0x1c'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.13'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='14' port='0x1d'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.14'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='15' port='0x1e'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.15'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='16' port='0x1f'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.16'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='17' port='0x20'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.17'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='18' port='0x21'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.18'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='19' port='0x22'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.19'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='20' port='0x23'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.20'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='21' port='0x24'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.21'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='22' port='0x25'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.22'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='23' port='0x26'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.23'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='24' port='0x27'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.24'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target chassis='25' port='0x28'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.25'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model name='pcie-pci-bridge'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='pci.26'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='usb'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <controller type='sata' index='0'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='ide'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:a3:78:c9'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target dev='tapd66f7017-23'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='net0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <serial type='pty'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log' append='off'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target type='isa-serial' port='0'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:        <model name='isa-serial'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      </target>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log' append='off'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <target type='serial' port='0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <input type='tablet' bus='usb'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='input0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='usb' bus='0' port='1'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <input type='mouse' bus='ps2'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='input1'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <input type='keyboard' bus='ps2'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='input2'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <listen type='address' address='::0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <audio id='1' type='none'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='video0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <watchdog model='itco' action='reset'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='watchdog0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </watchdog>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <memballoon model='virtio'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <stats period='10'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='balloon0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <rng model='virtio'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <backend model='random'>/dev/urandom</backend>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <alias name='rng0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <label>system_u:system_r:svirt_t:s0:c395,c640</label>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c395,c640</imagelabel>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <label>+107:+107</label>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <imagelabel>+107:+107</imagelabel>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:35:44 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:35:44 np0005603500 nova_compute[182934]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.659 182938 INFO nova.virt.libvirt.driver [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully detached device tap94ef64eb-51 from instance a94f97e8-6060-473c-92bc-75030c79b628 from the live domain config.
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.660 182938 DEBUG nova.virt.libvirt.vif [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1745004988',display_name='tempest-TestNetworkBasicOps-server-1745004988',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1745004988',id=3,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMI2O3WgANh2Ex39GfZpZtfGNn6Lqkeu95Vh4npQIMUyjsxGzg+vjYJnFgs6NBrTXoqH34OoaB7KMRljyOaVLfMxIzoP5XGG7G//VOeFqutEe9mXsK6+VunzjSElPnmqPg==',key_name='tempest-TestNetworkBasicOps-578042830',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:34:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0ulgrsf7',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:34:35Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=a94f97e8-6060-473c-92bc-75030c79b628,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.660 182938 DEBUG nova.network.os_vif_util [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.661 182938 DEBUG nova.network.os_vif_util [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.661 182938 DEBUG os_vif [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.663 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.663 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94ef64eb-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.665 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.667 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.668 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.670 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.671 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c72d40ce-455f-4c58-a65d-174be8d761c4) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.671 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.673 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.675 182938 INFO os_vif [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51')
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.676 182938 DEBUG nova.virt.driver [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1745004988', uuid='a94f97e8-6060-473c-92bc-75030c79b628'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_
bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841344.6764438) get_instance_driver_metadata 
/usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:35:44 np0005603500 nova_compute[182934]: 2026-01-31 06:35:44.677 182938 DEBUG nova.virt.libvirt.guest [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-1745004988</nova:name>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:35:44</nova:creationTime>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    <nova:port uuid="d66f7017-2344-441d-9926-108c71a6b524">
Jan 31 01:35:44 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:44 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:35:44 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:35:44 np0005603500 nova_compute[182934]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Jan 31 01:35:46 np0005603500 podman[213756]: 2026-01-31 06:35:46.140281614 +0000 UTC m=+0.046189942 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:35:46 np0005603500 podman[213755]: 2026-01-31 06:35:46.149540459 +0000 UTC m=+0.057478142 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:35:47 np0005603500 nova_compute[182934]: 2026-01-31 06:35:47.008 182938 DEBUG nova.compute.manager [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-deleted-94ef64eb-5138-4961-ad73-1296ae99b4f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:47 np0005603500 nova_compute[182934]: 2026-01-31 06:35:47.008 182938 INFO nova.compute.manager [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Neutron deleted interface 94ef64eb-5138-4961-ad73-1296ae99b4f1; detaching it from the instance and deleting it from the info cache
Jan 31 01:35:47 np0005603500 nova_compute[182934]: 2026-01-31 06:35:47.008 182938 DEBUG nova.network.neutron [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updating instance_info_cache with network_info: [{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:35:47 np0005603500 nova_compute[182934]: 2026-01-31 06:35:47.516 182938 DEBUG nova.objects.instance [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lazy-loading 'system_metadata' on Instance uuid a94f97e8-6060-473c-92bc-75030c79b628 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:35:47 np0005603500 nova_compute[182934]: 2026-01-31 06:35:47.702 182938 DEBUG oslo_concurrency.lockutils [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:35:47 np0005603500 nova_compute[182934]: 2026-01-31 06:35:47.702 182938 DEBUG oslo_concurrency.lockutils [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:35:47 np0005603500 nova_compute[182934]: 2026-01-31 06:35:47.702 182938 DEBUG nova.network.neutron [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:35:47 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:47.775 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:35:47 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:47.776 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:35:47 np0005603500 nova_compute[182934]: 2026-01-31 06:35:47.775 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.026 182938 DEBUG nova.objects.instance [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lazy-loading 'flavor' on Instance uuid a94f97e8-6060-473c-92bc-75030c79b628 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.319 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:48 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:48Z|00076|binding|INFO|Releasing lport 759ae516-9997-4f0f-b500-4c1a16b6262f from this chassis (sb_readonly=0)
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.458 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.541 182938 DEBUG nova.objects.base [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Object Instance<a94f97e8-6060-473c-92bc-75030c79b628> lazy-loaded attributes: system_metadata,flavor wrapper /usr/lib/python3.9/site-packages/nova/objects/base.py:136
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.543 182938 DEBUG nova.virt.libvirt.vif [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1745004988',display_name='tempest-TestNetworkBasicOps-server-1745004988',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1745004988',id=3,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMI2O3WgANh2Ex39GfZpZtfGNn6Lqkeu95Vh4npQIMUyjsxGzg+vjYJnFgs6NBrTXoqH34OoaB7KMRljyOaVLfMxIzoP5XGG7G//VOeFqutEe9mXsK6+VunzjSElPnmqPg==',key_name='tempest-TestNetworkBasicOps-578042830',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:34:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0ulgrsf7',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:34:35Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=a94f97e8-6060-473c-92bc-75030c79b628,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.543 182938 DEBUG nova.network.os_vif_util [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Converting VIF {"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.544 182938 DEBUG nova.network.os_vif_util [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.548 182938 DEBUG nova.virt.libvirt.guest [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.552 182938 DEBUG nova.virt.libvirt.guest [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <name>instance-00000003</name>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <uuid>a94f97e8-6060-473c-92bc-75030c79b628</uuid>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-1745004988</nova:name>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:35:44</nova:creationTime>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:port uuid="d66f7017-2344-441d-9926-108c71a6b524">
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:35:48 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <memory unit='KiB'>131072</memory>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <vcpu placement='static'>1</vcpu>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <resource>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <partition>/machine</partition>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </resource>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <sysinfo type='smbios'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='manufacturer'>RDO</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='serial'>a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='uuid'>a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='family'>Virtual Machine</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <boot dev='hd'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <smbios mode='sysinfo'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <vmcoreinfo state='on'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <vendor>AMD</vendor>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='x2apic'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc-deadline'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='hypervisor'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc_adjust'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='spec-ctrl'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='stibp'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='ssbd'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='cmp_legacy'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='overflow-recov'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='succor'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='ibrs'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='amd-ssbd'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='virt-ssbd'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='lbrv'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='tsc-scale'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='vmcb-clean'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='flushbyasid'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pause-filter'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pfthreshold'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='xsaves'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svm'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='topoext'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='npt'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='nrip-save'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <clock offset='utc'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <timer name='hpet' present='no'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <on_poweroff>destroy</on_poweroff>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <on_reboot>restart</on_reboot>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <on_crash>destroy</on_crash>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <disk type='file' device='disk'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk' index='2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <backingStore type='file' index='3'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:        <format type='raw'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:        <source file='/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:        <backingStore/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      </backingStore>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target dev='vda' bus='virtio'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='virtio-disk0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <disk type='file' device='cdrom'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.config' index='1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <backingStore/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target dev='sda' bus='sata'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <readonly/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='sata0-0-0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pcie.0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='1' port='0x10'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='2' port='0x11'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='3' port='0x12'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.3'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='4' port='0x13'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.4'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='5' port='0x14'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.5'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='6' port='0x15'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.6'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='7' port='0x16'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.7'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='8' port='0x17'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.8'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='9' port='0x18'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.9'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='10' port='0x19'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.10'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='11' port='0x1a'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.11'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='12' port='0x1b'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.12'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='13' port='0x1c'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.13'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='14' port='0x1d'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.14'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='15' port='0x1e'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.15'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='16' port='0x1f'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.16'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='17' port='0x20'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.17'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='18' port='0x21'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.18'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='19' port='0x22'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.19'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='20' port='0x23'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.20'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='21' port='0x24'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.21'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='22' port='0x25'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.22'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='23' port='0x26'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.23'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='24' port='0x27'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.24'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='25' port='0x28'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.25'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-pci-bridge'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.26'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='usb'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='sata' index='0'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='ide'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:a3:78:c9'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target dev='tapd66f7017-23'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='net0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <serial type='pty'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log' append='off'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target type='isa-serial' port='0'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:        <model name='isa-serial'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      </target>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log' append='off'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target type='serial' port='0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <input type='tablet' bus='usb'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='input0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='usb' bus='0' port='1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <input type='mouse' bus='ps2'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='input1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <input type='keyboard' bus='ps2'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='input2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <listen type='address' address='::0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <audio id='1' type='none'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='video0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <watchdog model='itco' action='reset'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='watchdog0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </watchdog>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <memballoon model='virtio'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <stats period='10'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='balloon0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <rng model='virtio'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <backend model='random'>/dev/urandom</backend>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='rng0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <label>system_u:system_r:svirt_t:s0:c395,c640</label>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c395,c640</imagelabel>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <label>+107:+107</label>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <imagelabel>+107:+107</imagelabel>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:35:48 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:35:48 np0005603500 nova_compute[182934]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.554 182938 DEBUG nova.virt.libvirt.guest [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.559 182938 DEBUG nova.virt.libvirt.guest [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:0b:28:e3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap94ef64eb-51"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <name>instance-00000003</name>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <uuid>a94f97e8-6060-473c-92bc-75030c79b628</uuid>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-1745004988</nova:name>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:35:44</nova:creationTime>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:port uuid="d66f7017-2344-441d-9926-108c71a6b524">
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:35:48 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <memory unit='KiB'>131072</memory>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <vcpu placement='static'>1</vcpu>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <resource>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <partition>/machine</partition>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </resource>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <sysinfo type='smbios'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='manufacturer'>RDO</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='serial'>a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='uuid'>a94f97e8-6060-473c-92bc-75030c79b628</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <entry name='family'>Virtual Machine</entry>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <boot dev='hd'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <smbios mode='sysinfo'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <vmcoreinfo state='on'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <vendor>AMD</vendor>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='x2apic'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc-deadline'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='hypervisor'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc_adjust'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='spec-ctrl'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='stibp'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='ssbd'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='cmp_legacy'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='overflow-recov'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='succor'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='ibrs'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='amd-ssbd'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='virt-ssbd'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='lbrv'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='tsc-scale'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='vmcb-clean'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='flushbyasid'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pause-filter'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pfthreshold'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='xsaves'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svm'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='require' name='topoext'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='npt'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <feature policy='disable' name='nrip-save'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <clock offset='utc'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <timer name='hpet' present='no'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <on_poweroff>destroy</on_poweroff>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <on_reboot>restart</on_reboot>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <on_crash>destroy</on_crash>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <disk type='file' device='disk'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk' index='2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <backingStore type='file' index='3'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:        <format type='raw'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:        <source file='/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:        <backingStore/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      </backingStore>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target dev='vda' bus='virtio'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='virtio-disk0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <disk type='file' device='cdrom'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/disk.config' index='1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <backingStore/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target dev='sda' bus='sata'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <readonly/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='sata0-0-0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pcie.0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='1' port='0x10'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='2' port='0x11'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='3' port='0x12'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.3'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='4' port='0x13'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.4'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='5' port='0x14'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.5'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='6' port='0x15'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.6'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='7' port='0x16'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.7'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='8' port='0x17'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.8'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='9' port='0x18'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.9'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='10' port='0x19'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.10'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='11' port='0x1a'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.11'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='12' port='0x1b'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.12'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='13' port='0x1c'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.13'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='14' port='0x1d'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.14'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='15' port='0x1e'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.15'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='16' port='0x1f'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.16'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='17' port='0x20'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.17'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='18' port='0x21'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.18'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='19' port='0x22'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.19'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='20' port='0x23'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.20'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='21' port='0x24'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.21'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='22' port='0x25'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.22'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='23' port='0x26'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.23'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='24' port='0x27'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.24'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target chassis='25' port='0x28'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.25'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model name='pcie-pci-bridge'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='pci.26'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='usb'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <controller type='sata' index='0'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='ide'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:a3:78:c9'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target dev='tapd66f7017-23'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='net0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <serial type='pty'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log' append='off'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target type='isa-serial' port='0'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:        <model name='isa-serial'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      </target>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628/console.log' append='off'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <target type='serial' port='0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <input type='tablet' bus='usb'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='input0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='usb' bus='0' port='1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <input type='mouse' bus='ps2'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='input1'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <input type='keyboard' bus='ps2'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='input2'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <listen type='address' address='::0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <audio id='1' type='none'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='video0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <watchdog model='itco' action='reset'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='watchdog0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </watchdog>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <memballoon model='virtio'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <stats period='10'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='balloon0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <rng model='virtio'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <backend model='random'>/dev/urandom</backend>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <alias name='rng0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <label>system_u:system_r:svirt_t:s0:c395,c640</label>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c395,c640</imagelabel>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <label>+107:+107</label>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <imagelabel>+107:+107</imagelabel>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:35:48 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:35:48 np0005603500 nova_compute[182934]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.560 182938 WARNING nova.virt.libvirt.driver [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Detaching interface fa:16:3e:0b:28:e3 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap94ef64eb-51' not found.
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.562 182938 DEBUG nova.virt.libvirt.vif [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1745004988',display_name='tempest-TestNetworkBasicOps-server-1745004988',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1745004988',id=3,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMI2O3WgANh2Ex39GfZpZtfGNn6Lqkeu95Vh4npQIMUyjsxGzg+vjYJnFgs6NBrTXoqH34OoaB7KMRljyOaVLfMxIzoP5XGG7G//VOeFqutEe9mXsK6+VunzjSElPnmqPg==',key_name='tempest-TestNetworkBasicOps-578042830',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:34:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0ulgrsf7',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:34:35Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=a94f97e8-6060-473c-92bc-75030c79b628,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.562 182938 DEBUG nova.network.os_vif_util [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Converting VIF {"id": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "address": "fa:16:3e:0b:28:e3", "network": {"id": "e729c920-47a0-485f-a4f2-ea061c5a8a32", "bridge": "br-int", "label": "tempest-network-smoke--1606482411", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap94ef64eb-51", "ovs_interfaceid": "94ef64eb-5138-4961-ad73-1296ae99b4f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.564 182938 DEBUG nova.network.os_vif_util [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.565 182938 DEBUG os_vif [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.567 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.568 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94ef64eb-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.568 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.571 182938 INFO os_vif [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:28:e3,bridge_name='br-int',has_traffic_filtering=True,id=94ef64eb-5138-4961-ad73-1296ae99b4f1,network=Network(e729c920-47a0-485f-a4f2-ea061c5a8a32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94ef64eb-51')
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.572 182938 DEBUG nova.virt.driver [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1745004988', uuid='a94f97e8-6060-473c-92bc-75030c79b628'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_wat
chdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841348.5723946) 
get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:35:48 np0005603500 nova_compute[182934]: 2026-01-31 06:35:48.573 182938 DEBUG nova.virt.libvirt.guest [req-1aeb56de-2245-4f0b-91ff-a7e7483820ce req-9971bd7f-6e75-48fe-8985-a9f5b40071fb 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-1745004988</nova:name>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:35:48</nova:creationTime>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    <nova:port uuid="d66f7017-2344-441d-9926-108c71a6b524">
Jan 31 01:35:48 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:35:48 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:35:48 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:35:48 np0005603500 nova_compute[182934]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Jan 31 01:35:49 np0005603500 nova_compute[182934]: 2026-01-31 06:35:49.456 182938 DEBUG nova.compute.manager [req-a576dbc9-5ffa-4edd-b42c-2fe8795d9d13 req-07d93808-b45b-493a-a577-358823d5f0f6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-changed-d66f7017-2344-441d-9926-108c71a6b524 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:49 np0005603500 nova_compute[182934]: 2026-01-31 06:35:49.456 182938 DEBUG nova.compute.manager [req-a576dbc9-5ffa-4edd-b42c-2fe8795d9d13 req-07d93808-b45b-493a-a577-358823d5f0f6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Refreshing instance network info cache due to event network-changed-d66f7017-2344-441d-9926-108c71a6b524. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:35:49 np0005603500 nova_compute[182934]: 2026-01-31 06:35:49.457 182938 DEBUG oslo_concurrency.lockutils [req-a576dbc9-5ffa-4edd-b42c-2fe8795d9d13 req-07d93808-b45b-493a-a577-358823d5f0f6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:35:49 np0005603500 nova_compute[182934]: 2026-01-31 06:35:49.671 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.038 182938 DEBUG oslo_concurrency.lockutils [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.039 182938 DEBUG oslo_concurrency.lockutils [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.039 182938 DEBUG oslo_concurrency.lockutils [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.039 182938 DEBUG oslo_concurrency.lockutils [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.039 182938 DEBUG oslo_concurrency.lockutils [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.040 182938 INFO nova.compute.manager [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Terminating instance
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.552 182938 DEBUG nova.compute.manager [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:35:50 np0005603500 kernel: tapd66f7017-23 (unregistering): left promiscuous mode
Jan 31 01:35:50 np0005603500 NetworkManager[55506]: <info>  [1769841350.5879] device (tapd66f7017-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:35:50 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:50Z|00077|binding|INFO|Releasing lport d66f7017-2344-441d-9926-108c71a6b524 from this chassis (sb_readonly=0)
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.588 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:50 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:50Z|00078|binding|INFO|Setting lport d66f7017-2344-441d-9926-108c71a6b524 down in Southbound
Jan 31 01:35:50 np0005603500 ovn_controller[95398]: 2026-01-31T06:35:50Z|00079|binding|INFO|Removing iface tapd66f7017-23 ovn-installed in OVS
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.595 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.599 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.601 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:78:c9 10.100.0.7'], port_security=['fa:16:3e:a3:78:c9 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a94f97e8-6060-473c-92bc-75030c79b628', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9478fb7-5187-4733-899d-45464c14414d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '19762263-34af-4bf7-9f2c-13801e872303', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9359aad-cbc6-45c4-a734-bba64ba33f13, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=d66f7017-2344-441d-9926-108c71a6b524) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.602 104644 INFO neutron.agent.ovn.metadata.agent [-] Port d66f7017-2344-441d-9926-108c71a6b524 in datapath d9478fb7-5187-4733-899d-45464c14414d unbound from our chassis
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.603 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9478fb7-5187-4733-899d-45464c14414d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.605 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4488e9e9-7b78-4fb8-8f3b-cb8e643bc0e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.605 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d9478fb7-5187-4733-899d-45464c14414d namespace which is not needed anymore
Jan 31 01:35:50 np0005603500 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 31 01:35:50 np0005603500 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 15.826s CPU time.
Jan 31 01:35:50 np0005603500 systemd-machined[154375]: Machine qemu-3-instance-00000003 terminated.
Jan 31 01:35:50 np0005603500 neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d[213266]: [NOTICE]   (213270) : haproxy version is 2.8.14-c23fe91
Jan 31 01:35:50 np0005603500 neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d[213266]: [NOTICE]   (213270) : path to executable is /usr/sbin/haproxy
Jan 31 01:35:50 np0005603500 neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d[213266]: [WARNING]  (213270) : Exiting Master process...
Jan 31 01:35:50 np0005603500 podman[213821]: 2026-01-31 06:35:50.727244747 +0000 UTC m=+0.032422554 container kill 68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:35:50 np0005603500 neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d[213266]: [ALERT]    (213270) : Current worker (213272) exited with code 143 (Terminated)
Jan 31 01:35:50 np0005603500 neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d[213266]: [WARNING]  (213270) : All workers exited. Exiting... (0)
Jan 31 01:35:50 np0005603500 systemd[1]: libpod-68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f.scope: Deactivated successfully.
Jan 31 01:35:50 np0005603500 podman[213837]: 2026-01-31 06:35:50.77161503 +0000 UTC m=+0.024703339 container died 68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.774 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.777 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.807 182938 INFO nova.virt.libvirt.driver [-] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Instance destroyed successfully.
Jan 31 01:35:50 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f-userdata-shm.mount: Deactivated successfully.
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.808 182938 DEBUG nova.objects.instance [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid a94f97e8-6060-473c-92bc-75030c79b628 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:35:50 np0005603500 systemd[1]: var-lib-containers-storage-overlay-43d8a40676991a3232a4371b6dfea0e89a6bd5df7c6a8e57ce9669c72791d5d3-merged.mount: Deactivated successfully.
Jan 31 01:35:50 np0005603500 podman[213837]: 2026-01-31 06:35:50.823494681 +0000 UTC m=+0.076582970 container cleanup 68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:35:50 np0005603500 systemd[1]: libpod-conmon-68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f.scope: Deactivated successfully.
Jan 31 01:35:50 np0005603500 podman[213839]: 2026-01-31 06:35:50.844290873 +0000 UTC m=+0.085198564 container remove 68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.848 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[190d8d4a-be10-4cb7-9881-f11d0b007d8f]: (4, ("Sat Jan 31 06:35:50 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d (68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f)\n68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f\nSat Jan 31 06:35:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d9478fb7-5187-4733-899d-45464c14414d (68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f)\n68d5e98e3a2ca8fe655a72d74ac2dc62a21fb6a938f3988dcdc16c88a14b355f\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.850 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d077e47b-798b-435b-be38-1be8957f367f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.851 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9478fb7-5187-4733-899d-45464c14414d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.851 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba67213-4e46-4173-a659-c15d258fd830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.852 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9478fb7-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:50 np0005603500 kernel: tapd9478fb7-50: left promiscuous mode
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.855 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:50 np0005603500 nova_compute[182934]: 2026-01-31 06:35:50.862 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.865 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[81f35cd3-bccf-417b-bad0-cd8dc4b04d52]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.881 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d5694337-6f01-4cef-aef3-71854ebf7e68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.883 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[541d09bb-42c6-4cc6-8e5b-7bef43b11d42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.895 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[7b913004-aba5-4f55-a058-ad1dc11dfc62]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358463, 'reachable_time': 16250, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213888, 'error': None, 'target': 'ovnmeta-d9478fb7-5187-4733-899d-45464c14414d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:50 np0005603500 systemd[1]: run-netns-ovnmeta\x2dd9478fb7\x2d5187\x2d4733\x2d899d\x2d45464c14414d.mount: Deactivated successfully.
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.900 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d9478fb7-5187-4733-899d-45464c14414d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:35:50 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:50.900 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[ae753825-4006-4bc9-995a-a02295d036e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.071 182938 DEBUG nova.network.neutron [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updating instance_info_cache with network_info: [{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.315 182938 DEBUG nova.virt.libvirt.vif [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1745004988',display_name='tempest-TestNetworkBasicOps-server-1745004988',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1745004988',id=3,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMI2O3WgANh2Ex39GfZpZtfGNn6Lqkeu95Vh4npQIMUyjsxGzg+vjYJnFgs6NBrTXoqH34OoaB7KMRljyOaVLfMxIzoP5XGG7G//VOeFqutEe9mXsK6+VunzjSElPnmqPg==',key_name='tempest-TestNetworkBasicOps-578042830',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:34:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0ulgrsf7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:34:35Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=a94f97e8-6060-473c-92bc-75030c79b628,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.316 182938 DEBUG nova.network.os_vif_util [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.317 182938 DEBUG nova.network.os_vif_util [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a3:78:c9,bridge_name='br-int',has_traffic_filtering=True,id=d66f7017-2344-441d-9926-108c71a6b524,network=Network(d9478fb7-5187-4733-899d-45464c14414d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66f7017-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.317 182938 DEBUG os_vif [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:78:c9,bridge_name='br-int',has_traffic_filtering=True,id=d66f7017-2344-441d-9926-108c71a6b524,network=Network(d9478fb7-5187-4733-899d-45464c14414d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66f7017-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.319 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.319 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd66f7017-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.321 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.324 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.325 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.325 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9fd9bdb5-9f6b-490f-bb60-50946c9c306a) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.326 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.326 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.329 182938 INFO os_vif [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:78:c9,bridge_name='br-int',has_traffic_filtering=True,id=d66f7017-2344-441d-9926-108c71a6b524,network=Network(d9478fb7-5187-4733-899d-45464c14414d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66f7017-23')
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.329 182938 INFO nova.virt.libvirt.driver [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Deleting instance files /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628_del
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.330 182938 INFO nova.virt.libvirt.driver [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Deletion of /var/lib/nova/instances/a94f97e8-6060-473c-92bc-75030c79b628_del complete
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.580 182938 DEBUG oslo_concurrency.lockutils [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.583 182938 DEBUG oslo_concurrency.lockutils [req-a576dbc9-5ffa-4edd-b42c-2fe8795d9d13 req-07d93808-b45b-493a-a577-358823d5f0f6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.584 182938 DEBUG nova.network.neutron [req-a576dbc9-5ffa-4edd-b42c-2fe8795d9d13 req-07d93808-b45b-493a-a577-358823d5f0f6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Refreshing network info cache for port d66f7017-2344-441d-9926-108c71a6b524 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.774 182938 DEBUG nova.compute.manager [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-unplugged-d66f7017-2344-441d-9926-108c71a6b524 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.775 182938 DEBUG oslo_concurrency.lockutils [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.775 182938 DEBUG oslo_concurrency.lockutils [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.776 182938 DEBUG oslo_concurrency.lockutils [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.776 182938 DEBUG nova.compute.manager [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] No waiting events found dispatching network-vif-unplugged-d66f7017-2344-441d-9926-108c71a6b524 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.776 182938 DEBUG nova.compute.manager [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-unplugged-d66f7017-2344-441d-9926-108c71a6b524 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.776 182938 DEBUG nova.compute.manager [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-plugged-d66f7017-2344-441d-9926-108c71a6b524 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.776 182938 DEBUG oslo_concurrency.lockutils [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "a94f97e8-6060-473c-92bc-75030c79b628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.777 182938 DEBUG oslo_concurrency.lockutils [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.777 182938 DEBUG oslo_concurrency.lockutils [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.777 182938 DEBUG nova.compute.manager [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] No waiting events found dispatching network-vif-plugged-d66f7017-2344-441d-9926-108c71a6b524 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.777 182938 WARNING nova.compute.manager [req-05507367-2570-46b1-ab07-de291015b943 req-57ccd9eb-fee1-4459-86fa-6e9e969ad4b7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received unexpected event network-vif-plugged-d66f7017-2344-441d-9926-108c71a6b524 for instance with vm_state active and task_state deleting.
Jan 31 01:35:51 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:51.777 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.842 182938 INFO nova.compute.manager [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Took 1.29 seconds to destroy the instance on the hypervisor.
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.843 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.844 182938 DEBUG nova.compute.manager [-] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:35:51 np0005603500 nova_compute[182934]: 2026-01-31 06:35:51.844 182938 DEBUG nova.network.neutron [-] [instance: a94f97e8-6060-473c-92bc-75030c79b628] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:35:52 np0005603500 nova_compute[182934]: 2026-01-31 06:35:52.090 182938 DEBUG oslo_concurrency.lockutils [None req-b095564e-1aa1-491b-928b-aaecdd1f1df5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "interface-a94f97e8-6060-473c-92bc-75030c79b628-94ef64eb-5138-4961-ad73-1296ae99b4f1" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 28.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:53 np0005603500 nova_compute[182934]: 2026-01-31 06:35:53.320 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:53 np0005603500 nova_compute[182934]: 2026-01-31 06:35:53.755 182938 DEBUG nova.network.neutron [-] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:35:53 np0005603500 nova_compute[182934]: 2026-01-31 06:35:53.972 182938 DEBUG nova.compute.manager [req-40b5de82-275b-4443-82d3-5d40a6576747 req-1cb1713b-eb07-400e-9ca7-5ab85b922288 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Received event network-vif-deleted-d66f7017-2344-441d-9926-108c71a6b524 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:35:54 np0005603500 podman[213889]: 2026-01-31 06:35:54.134288606 +0000 UTC m=+0.053447703 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., release=1769056855, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, version=9.7, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 01:35:54 np0005603500 nova_compute[182934]: 2026-01-31 06:35:54.267 182938 INFO nova.compute.manager [-] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Took 2.42 seconds to deallocate network for instance.
Jan 31 01:35:54 np0005603500 nova_compute[182934]: 2026-01-31 06:35:54.777 182938 DEBUG oslo_concurrency.lockutils [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:54 np0005603500 nova_compute[182934]: 2026-01-31 06:35:54.778 182938 DEBUG oslo_concurrency.lockutils [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:54 np0005603500 nova_compute[182934]: 2026-01-31 06:35:54.839 182938 DEBUG nova.compute.provider_tree [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:35:55 np0005603500 nova_compute[182934]: 2026-01-31 06:35:55.280 182938 DEBUG nova.network.neutron [req-a576dbc9-5ffa-4edd-b42c-2fe8795d9d13 req-07d93808-b45b-493a-a577-358823d5f0f6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updated VIF entry in instance network info cache for port d66f7017-2344-441d-9926-108c71a6b524. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:35:55 np0005603500 nova_compute[182934]: 2026-01-31 06:35:55.280 182938 DEBUG nova.network.neutron [req-a576dbc9-5ffa-4edd-b42c-2fe8795d9d13 req-07d93808-b45b-493a-a577-358823d5f0f6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: a94f97e8-6060-473c-92bc-75030c79b628] Updating instance_info_cache with network_info: [{"id": "d66f7017-2344-441d-9926-108c71a6b524", "address": "fa:16:3e:a3:78:c9", "network": {"id": "d9478fb7-5187-4733-899d-45464c14414d", "bridge": "br-int", "label": "tempest-network-smoke--1757708758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66f7017-23", "ovs_interfaceid": "d66f7017-2344-441d-9926-108c71a6b524", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:35:55 np0005603500 nova_compute[182934]: 2026-01-31 06:35:55.345 182938 DEBUG nova.scheduler.client.report [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:35:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:55.754 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:35:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:55.755 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:35:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:35:55.755 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:55 np0005603500 nova_compute[182934]: 2026-01-31 06:35:55.788 182938 DEBUG oslo_concurrency.lockutils [req-a576dbc9-5ffa-4edd-b42c-2fe8795d9d13 req-07d93808-b45b-493a-a577-358823d5f0f6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-a94f97e8-6060-473c-92bc-75030c79b628" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:35:55 np0005603500 nova_compute[182934]: 2026-01-31 06:35:55.860 182938 DEBUG oslo_concurrency.lockutils [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:55 np0005603500 nova_compute[182934]: 2026-01-31 06:35:55.890 182938 INFO nova.scheduler.client.report [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance a94f97e8-6060-473c-92bc-75030c79b628
Jan 31 01:35:56 np0005603500 nova_compute[182934]: 2026-01-31 06:35:56.327 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:35:56 np0005603500 nova_compute[182934]: 2026-01-31 06:35:56.906 182938 DEBUG oslo_concurrency.lockutils [None req-3ea7bbee-6166-481f-9fd3-de2828b65669 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "a94f97e8-6060-473c-92bc-75030c79b628" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.867s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:35:57 np0005603500 podman[213911]: 2026-01-31 06:35:57.284052873 +0000 UTC m=+0.067138019 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 01:35:58 np0005603500 nova_compute[182934]: 2026-01-31 06:35:58.321 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:01 np0005603500 nova_compute[182934]: 2026-01-31 06:36:01.365 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:02 np0005603500 podman[213937]: 2026-01-31 06:36:02.129048201 +0000 UTC m=+0.051362436 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:36:03 np0005603500 nova_compute[182934]: 2026-01-31 06:36:03.322 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:04 np0005603500 podman[213957]: 2026-01-31 06:36:04.142270845 +0000 UTC m=+0.055478117 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:36:04 np0005603500 nova_compute[182934]: 2026-01-31 06:36:04.540 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:04 np0005603500 nova_compute[182934]: 2026-01-31 06:36:04.558 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:06 np0005603500 nova_compute[182934]: 2026-01-31 06:36:06.367 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:08 np0005603500 nova_compute[182934]: 2026-01-31 06:36:08.324 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:11 np0005603500 nova_compute[182934]: 2026-01-31 06:36:11.388 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:13 np0005603500 nova_compute[182934]: 2026-01-31 06:36:13.326 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:14 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:14.808 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:48:f0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29d6cd86-2fe1-46f4-ab6e-26f3373754c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f3c2d942-7b94-4ad6-ae42-58234043864e) old=Port_Binding(mac=['fa:16:3e:a5:48:f0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:36:14 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:14.810 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f3c2d942-7b94-4ad6-ae42-58234043864e in datapath 5bb2649b-6b65-4e17-a6b3-abb539667aef updated
Jan 31 01:36:14 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:14.811 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bb2649b-6b65-4e17-a6b3-abb539667aef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:36:14 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:14.812 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[022d4f07-caba-4c9c-afc8-f7a468a6f3a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:16 np0005603500 nova_compute[182934]: 2026-01-31 06:36:16.391 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:17 np0005603500 podman[213982]: 2026-01-31 06:36:17.124164344 +0000 UTC m=+0.041458061 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:36:17 np0005603500 podman[213983]: 2026-01-31 06:36:17.125851618 +0000 UTC m=+0.040322075 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.982 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.984 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:36:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:36:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:36:18 np0005603500 nova_compute[182934]: 2026-01-31 06:36:18.335 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:21 np0005603500 nova_compute[182934]: 2026-01-31 06:36:21.393 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:23 np0005603500 nova_compute[182934]: 2026-01-31 06:36:23.336 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:25 np0005603500 podman[214025]: 2026-01-31 06:36:25.125877411 +0000 UTC m=+0.043771385 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red 
Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc.)
Jan 31 01:36:25 np0005603500 nova_compute[182934]: 2026-01-31 06:36:25.179 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:36:26 np0005603500 nova_compute[182934]: 2026-01-31 06:36:26.395 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:28 np0005603500 nova_compute[182934]: 2026-01-31 06:36:28.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:36:28 np0005603500 podman[214046]: 2026-01-31 06:36:28.152436611 +0000 UTC m=+0.070144540 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 01:36:28 np0005603500 nova_compute[182934]: 2026-01-31 06:36:28.337 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:29 np0005603500 nova_compute[182934]: 2026-01-31 06:36:29.022 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:29 np0005603500 nova_compute[182934]: 2026-01-31 06:36:29.022 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:29 np0005603500 nova_compute[182934]: 2026-01-31 06:36:29.022 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:36:29 np0005603500 nova_compute[182934]: 2026-01-31 06:36:29.023 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:36:29 np0005603500 nova_compute[182934]: 2026-01-31 06:36:29.177 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:36:29 np0005603500 nova_compute[182934]: 2026-01-31 06:36:29.178 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5792MB free_disk=73.21573257446289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:36:29 np0005603500 nova_compute[182934]: 2026-01-31 06:36:29.178 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:29 np0005603500 nova_compute[182934]: 2026-01-31 06:36:29.178 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:30 np0005603500 nova_compute[182934]: 2026-01-31 06:36:30.451 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:36:30 np0005603500 nova_compute[182934]: 2026-01-31 06:36:30.452 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:36:30 np0005603500 nova_compute[182934]: 2026-01-31 06:36:30.694 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing inventories for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Jan 31 01:36:30 np0005603500 nova_compute[182934]: 2026-01-31 06:36:30.931 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating ProviderTree inventory for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Jan 31 01:36:30 np0005603500 nova_compute[182934]: 2026-01-31 06:36:30.931 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:36:30 np0005603500 nova_compute[182934]: 2026-01-31 06:36:30.947 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing aggregate associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Jan 31 01:36:30 np0005603500 nova_compute[182934]: 2026-01-31 06:36:30.972 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing trait associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, traits: COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,HW_ARCH_X86_64,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_CRB,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SHA,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_FMA3,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Jan 31 01:36:30 np0005603500 nova_compute[182934]: 2026-01-31 06:36:30.993 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:36:31 np0005603500 nova_compute[182934]: 2026-01-31 06:36:31.397 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:31 np0005603500 nova_compute[182934]: 2026-01-31 06:36:31.503 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:36:32 np0005603500 nova_compute[182934]: 2026-01-31 06:36:32.016 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:36:32 np0005603500 nova_compute[182934]: 2026-01-31 06:36:32.016 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:36:33 np0005603500 nova_compute[182934]: 2026-01-31 06:36:33.017 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:36:33 np0005603500 nova_compute[182934]: 2026-01-31 06:36:33.017 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:36:33 np0005603500 nova_compute[182934]: 2026-01-31 06:36:33.017 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:36:33 np0005603500 nova_compute[182934]: 2026-01-31 06:36:33.018 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:36:33 np0005603500 nova_compute[182934]: 2026-01-31 06:36:33.018 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:36:33 np0005603500 nova_compute[182934]: 2026-01-31 06:36:33.018 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:36:33 np0005603500 podman[214073]: 2026-01-31 06:36:33.12626683 +0000 UTC m=+0.044133906 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 01:36:33 np0005603500 nova_compute[182934]: 2026-01-31 06:36:33.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:36:33 np0005603500 nova_compute[182934]: 2026-01-31 06:36:33.339 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:35 np0005603500 podman[214092]: 2026-01-31 06:36:35.133449112 +0000 UTC m=+0.047367546 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:36:35 np0005603500 nova_compute[182934]: 2026-01-31 06:36:35.534 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "5a6339ba-548e-4964-9c45-15df2cf116b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:35 np0005603500 nova_compute[182934]: 2026-01-31 06:36:35.534 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:36 np0005603500 nova_compute[182934]: 2026-01-31 06:36:36.051 182938 DEBUG nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:36:36 np0005603500 nova_compute[182934]: 2026-01-31 06:36:36.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:36:36 np0005603500 nova_compute[182934]: 2026-01-31 06:36:36.399 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:36 np0005603500 nova_compute[182934]: 2026-01-31 06:36:36.655 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:36 np0005603500 nova_compute[182934]: 2026-01-31 06:36:36.656 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:36 np0005603500 nova_compute[182934]: 2026-01-31 06:36:36.664 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:36:36 np0005603500 nova_compute[182934]: 2026-01-31 06:36:36.664 182938 INFO nova.compute.claims [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:36:38 np0005603500 nova_compute[182934]: 2026-01-31 06:36:38.117 182938 DEBUG nova.compute.provider_tree [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:36:38 np0005603500 nova_compute[182934]: 2026-01-31 06:36:38.373 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:38 np0005603500 nova_compute[182934]: 2026-01-31 06:36:38.635 182938 DEBUG nova.scheduler.client.report [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:36:39 np0005603500 nova_compute[182934]: 2026-01-31 06:36:39.147 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.491s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:36:39 np0005603500 nova_compute[182934]: 2026-01-31 06:36:39.148 182938 DEBUG nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:36:39 np0005603500 nova_compute[182934]: 2026-01-31 06:36:39.681 182938 DEBUG nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:36:39 np0005603500 nova_compute[182934]: 2026-01-31 06:36:39.682 182938 DEBUG nova.network.neutron [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:36:40 np0005603500 nova_compute[182934]: 2026-01-31 06:36:40.175 182938 DEBUG nova.policy [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:36:40 np0005603500 nova_compute[182934]: 2026-01-31 06:36:40.204 182938 INFO nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:36:40 np0005603500 nova_compute[182934]: 2026-01-31 06:36:40.734 182938 DEBUG nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.401 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.496 182938 DEBUG nova.network.neutron [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Successfully created port: 6e13d68c-b169-42b0-bafe-2026dd7b7c9f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.777 182938 DEBUG nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.778 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.779 182938 INFO nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Creating image(s)
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.780 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.780 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.781 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.781 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.786 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.788 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.832 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.833 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.834 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.835 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.839 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.840 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.888 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.889 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.985 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk 1073741824" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.986 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:36:41 np0005603500 nova_compute[182934]: 2026-01-31 06:36:41.987 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.036 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.037 182938 DEBUG nova.virt.disk.api [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.037 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.085 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.086 182938 DEBUG nova.virt.disk.api [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.087 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.087 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Ensure instance console log exists: /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.087 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.088 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:42 np0005603500 nova_compute[182934]: 2026-01-31 06:36:42.088 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:36:43 np0005603500 nova_compute[182934]: 2026-01-31 06:36:43.373 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:44 np0005603500 nova_compute[182934]: 2026-01-31 06:36:44.825 182938 DEBUG nova.network.neutron [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Successfully updated port: 6e13d68c-b169-42b0-bafe-2026dd7b7c9f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:36:45 np0005603500 nova_compute[182934]: 2026-01-31 06:36:45.064 182938 DEBUG nova.compute.manager [req-ba338594-df97-4d05-a357-b7f7e8d7f0c8 req-acf916c3-35a8-4d68-8427-c3ec1192f691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received event network-changed-6e13d68c-b169-42b0-bafe-2026dd7b7c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:36:45 np0005603500 nova_compute[182934]: 2026-01-31 06:36:45.065 182938 DEBUG nova.compute.manager [req-ba338594-df97-4d05-a357-b7f7e8d7f0c8 req-acf916c3-35a8-4d68-8427-c3ec1192f691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Refreshing instance network info cache due to event network-changed-6e13d68c-b169-42b0-bafe-2026dd7b7c9f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:36:45 np0005603500 nova_compute[182934]: 2026-01-31 06:36:45.065 182938 DEBUG oslo_concurrency.lockutils [req-ba338594-df97-4d05-a357-b7f7e8d7f0c8 req-acf916c3-35a8-4d68-8427-c3ec1192f691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:36:45 np0005603500 nova_compute[182934]: 2026-01-31 06:36:45.065 182938 DEBUG oslo_concurrency.lockutils [req-ba338594-df97-4d05-a357-b7f7e8d7f0c8 req-acf916c3-35a8-4d68-8427-c3ec1192f691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:36:45 np0005603500 nova_compute[182934]: 2026-01-31 06:36:45.066 182938 DEBUG nova.network.neutron [req-ba338594-df97-4d05-a357-b7f7e8d7f0c8 req-acf916c3-35a8-4d68-8427-c3ec1192f691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Refreshing network info cache for port 6e13d68c-b169-42b0-bafe-2026dd7b7c9f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:36:45 np0005603500 nova_compute[182934]: 2026-01-31 06:36:45.351 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:36:46 np0005603500 nova_compute[182934]: 2026-01-31 06:36:46.403 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:46 np0005603500 nova_compute[182934]: 2026-01-31 06:36:46.665 182938 DEBUG nova.network.neutron [req-ba338594-df97-4d05-a357-b7f7e8d7f0c8 req-acf916c3-35a8-4d68-8427-c3ec1192f691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:36:46 np0005603500 ovn_controller[95398]: 2026-01-31T06:36:46Z|00080|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 31 01:36:48 np0005603500 nova_compute[182934]: 2026-01-31 06:36:48.014 182938 DEBUG nova.network.neutron [req-ba338594-df97-4d05-a357-b7f7e8d7f0c8 req-acf916c3-35a8-4d68-8427-c3ec1192f691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:36:48 np0005603500 podman[214132]: 2026-01-31 06:36:48.123414466 +0000 UTC m=+0.045788467 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:36:48 np0005603500 podman[214133]: 2026-01-31 06:36:48.138317412 +0000 UTC m=+0.053487947 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:36:48 np0005603500 nova_compute[182934]: 2026-01-31 06:36:48.374 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:48 np0005603500 nova_compute[182934]: 2026-01-31 06:36:48.523 182938 DEBUG oslo_concurrency.lockutils [req-ba338594-df97-4d05-a357-b7f7e8d7f0c8 req-acf916c3-35a8-4d68-8427-c3ec1192f691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:36:48 np0005603500 nova_compute[182934]: 2026-01-31 06:36:48.523 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:36:48 np0005603500 nova_compute[182934]: 2026-01-31 06:36:48.524 182938 DEBUG nova.network.neutron [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:36:49 np0005603500 nova_compute[182934]: 2026-01-31 06:36:49.468 182938 DEBUG nova.network.neutron [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:36:51 np0005603500 nova_compute[182934]: 2026-01-31 06:36:51.405 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:52 np0005603500 nova_compute[182934]: 2026-01-31 06:36:52.691 182938 DEBUG nova.network.neutron [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Updating instance_info_cache with network_info: [{"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.198 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.199 182938 DEBUG nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Instance network_info: |[{"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.203 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Start _get_guest_xml network_info=[{"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.207 182938 WARNING nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.209 182938 DEBUG nova.virt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1600725576', uuid='5a6339ba-548e-4964-9c45-15df2cf116b5'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841413.2089648) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.219 182938 DEBUG nova.virt.libvirt.host [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.220 182938 DEBUG nova.virt.libvirt.host [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.223 182938 DEBUG nova.virt.libvirt.host [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.224 182938 DEBUG nova.virt.libvirt.host [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.224 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.225 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.225 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.225 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.226 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.226 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.226 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.226 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.227 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.227 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.227 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.227 182938 DEBUG nova.virt.hardware [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.231 182938 DEBUG nova.virt.libvirt.vif [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:36:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1600725576',display_name='tempest-TestNetworkBasicOps-server-1600725576',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1600725576',id=4,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBVf4DFKNCoqVbCeDiHmpnUJgB+AhhR5LyMEr+wH5zX90PeFXhDIlV+tsioKRgxnkIW5+o7qlCZ0JP/wGThi+7YhSDHjcF3W9ZrpMft2psVh5chsymmjI3kuU5RAc1Dd6Q==',key_name='tempest-TestNetworkBasicOps-372120458',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-23axq0bw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:36:40Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=5a6339ba-548e-4964-9c45-15df2cf116b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.232 182938 DEBUG nova.network.os_vif_util [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.233 182938 DEBUG nova.network.os_vif_util [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:a2:11,bridge_name='br-int',has_traffic_filtering=True,id=6e13d68c-b169-42b0-bafe-2026dd7b7c9f,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e13d68c-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.234 182938 DEBUG nova.objects.instance [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5a6339ba-548e-4964-9c45-15df2cf116b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.377 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.742 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <uuid>5a6339ba-548e-4964-9c45-15df2cf116b5</uuid>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <name>instance-00000004</name>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-1600725576</nova:name>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:36:53</nova:creationTime>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:        <nova:port uuid="6e13d68c-b169-42b0-bafe-2026dd7b7c9f">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <entry name="serial">5a6339ba-548e-4964-9c45-15df2cf116b5</entry>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <entry name="uuid">5a6339ba-548e-4964-9c45-15df2cf116b5</entry>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk.config"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:7a:a2:11"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <target dev="tap6e13d68c-b1"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/console.log" append="off"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:36:53 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:36:53 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:36:53 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:36:53 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.744 182938 DEBUG nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Preparing to wait for external event network-vif-plugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.744 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.745 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.745 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.745 182938 DEBUG nova.virt.libvirt.vif [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:36:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1600725576',display_name='tempest-TestNetworkBasicOps-server-1600725576',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1600725576',id=4,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBVf4DFKNCoqVbCeDiHmpnUJgB+AhhR5LyMEr+wH5zX90PeFXhDIlV+tsioKRgxnkIW5+o7qlCZ0JP/wGThi+7YhSDHjcF3W9ZrpMft2psVh5chsymmjI3kuU5RAc1Dd6Q==',key_name='tempest-TestNetworkBasicOps-372120458',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-23axq0bw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:36:40Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=5a6339ba-548e-4964-9c45-15df2cf116b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.746 182938 DEBUG nova.network.os_vif_util [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.746 182938 DEBUG nova.network.os_vif_util [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:a2:11,bridge_name='br-int',has_traffic_filtering=True,id=6e13d68c-b169-42b0-bafe-2026dd7b7c9f,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e13d68c-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.747 182938 DEBUG os_vif [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:a2:11,bridge_name='br-int',has_traffic_filtering=True,id=6e13d68c-b169-42b0-bafe-2026dd7b7c9f,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e13d68c-b1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.748 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.748 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.749 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.749 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.750 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '152aa086-c57b-5cd8-b2ed-1b9a172e011c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.751 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.752 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.755 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.755 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e13d68c-b1, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.756 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6e13d68c-b1, col_values=(('qos', UUID('51703e61-10f0-4814-8b7c-5c7e0c46a554')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.756 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6e13d68c-b1, col_values=(('external_ids', {'iface-id': '6e13d68c-b169-42b0-bafe-2026dd7b7c9f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:a2:11', 'vm-uuid': '5a6339ba-548e-4964-9c45-15df2cf116b5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.758 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:53 np0005603500 NetworkManager[55506]: <info>  [1769841413.7594] manager: (tap6e13d68c-b1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.761 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.763 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:53 np0005603500 nova_compute[182934]: 2026-01-31 06:36:53.763 182938 INFO os_vif [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:a2:11,bridge_name='br-int',has_traffic_filtering=True,id=6e13d68c-b169-42b0-bafe-2026dd7b7c9f,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e13d68c-b1')
Jan 31 01:36:55 np0005603500 nova_compute[182934]: 2026-01-31 06:36:55.333 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:36:55 np0005603500 nova_compute[182934]: 2026-01-31 06:36:55.334 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:36:55 np0005603500 nova_compute[182934]: 2026-01-31 06:36:55.334 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:7a:a2:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:36:55 np0005603500 nova_compute[182934]: 2026-01-31 06:36:55.334 182938 INFO nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Using config drive
Jan 31 01:36:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:55.809 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:55.809 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:55.810 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:36:56 np0005603500 podman[214177]: 2026-01-31 06:36:56.139627313 +0000 UTC m=+0.061105697 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.openshift.expose-services=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vcs-type=git)
Jan 31 01:36:57 np0005603500 nova_compute[182934]: 2026-01-31 06:36:57.668 182938 INFO nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Creating config drive at /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk.config
Jan 31 01:36:57 np0005603500 nova_compute[182934]: 2026-01-31 06:36:57.671 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp72bx7mmn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:36:57 np0005603500 nova_compute[182934]: 2026-01-31 06:36:57.788 182938 DEBUG oslo_concurrency.processutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp72bx7mmn" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:36:57 np0005603500 kernel: tap6e13d68c-b1: entered promiscuous mode
Jan 31 01:36:57 np0005603500 NetworkManager[55506]: <info>  [1769841417.8261] manager: (tap6e13d68c-b1): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 31 01:36:57 np0005603500 ovn_controller[95398]: 2026-01-31T06:36:57Z|00081|binding|INFO|Claiming lport 6e13d68c-b169-42b0-bafe-2026dd7b7c9f for this chassis.
Jan 31 01:36:57 np0005603500 ovn_controller[95398]: 2026-01-31T06:36:57Z|00082|binding|INFO|6e13d68c-b169-42b0-bafe-2026dd7b7c9f: Claiming fa:16:3e:7a:a2:11 10.100.0.10
Jan 31 01:36:57 np0005603500 nova_compute[182934]: 2026-01-31 06:36:57.826 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:57 np0005603500 nova_compute[182934]: 2026-01-31 06:36:57.831 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:57 np0005603500 nova_compute[182934]: 2026-01-31 06:36:57.842 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.851 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:a2:11 10.100.0.10'], port_security=['fa:16:3e:7a:a2:11 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5a6339ba-548e-4964-9c45-15df2cf116b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd0f1208b-2625-4bf0-aaaf-6a0e9a24c37e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29d6cd86-2fe1-46f4-ab6e-26f3373754c7, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=6e13d68c-b169-42b0-bafe-2026dd7b7c9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:36:57 np0005603500 systemd-udevd[214215]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.855 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 6e13d68c-b169-42b0-bafe-2026dd7b7c9f in datapath 5bb2649b-6b65-4e17-a6b3-abb539667aef bound to our chassis
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.856 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5bb2649b-6b65-4e17-a6b3-abb539667aef
Jan 31 01:36:57 np0005603500 ovn_controller[95398]: 2026-01-31T06:36:57Z|00083|binding|INFO|Setting lport 6e13d68c-b169-42b0-bafe-2026dd7b7c9f ovn-installed in OVS
Jan 31 01:36:57 np0005603500 ovn_controller[95398]: 2026-01-31T06:36:57Z|00084|binding|INFO|Setting lport 6e13d68c-b169-42b0-bafe-2026dd7b7c9f up in Southbound
Jan 31 01:36:57 np0005603500 nova_compute[182934]: 2026-01-31 06:36:57.859 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:57 np0005603500 systemd-machined[154375]: New machine qemu-4-instance-00000004.
Jan 31 01:36:57 np0005603500 NetworkManager[55506]: <info>  [1769841417.8641] device (tap6e13d68c-b1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:36:57 np0005603500 NetworkManager[55506]: <info>  [1769841417.8648] device (tap6e13d68c-b1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.867 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[9ff6750d-f428-440c-84fa-cb609e6a1c0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.868 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5bb2649b-61 in ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.869 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5bb2649b-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.869 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4da9cfb0-cd50-42cb-8c73-45daf1ca5118]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.870 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[73c71b7b-832d-4777-a5ba-cab20907ed6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.876 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[25181528-2d0f-4505-868d-b7fab1987044]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.886 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d632469d-bdc5-414d-bfcc-9fec8d0c1c58]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.906 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[3af9279e-99b9-4244-a964-2d18108f7985]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.910 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a2665169-4d58-4ce3-bc1f-064e48b6e6fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 NetworkManager[55506]: <info>  [1769841417.9115] manager: (tap5bb2649b-60): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.932 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[cb8a396d-1319-41b5-bf55-8df3167c8851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.934 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[87441f46-8706-44fb-93fd-a2093b7a279a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 NetworkManager[55506]: <info>  [1769841417.9488] device (tap5bb2649b-60): carrier: link connected
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.950 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[678dd81d-de60-4630-9b52-93272de5e642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.964 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ff16275d-33e8-4080-a9b3-4b216225fed5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bb2649b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:48:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372924, 'reachable_time': 35959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214249, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.976 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e64d7eec-f322-4b68-bb95-3f6dbd762dad]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea5:48f0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372924, 'tstamp': 372924}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214250, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:57.988 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[be914b41-a6b8-4a5b-9ec3-915ab0d46e86]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bb2649b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:48:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372924, 'reachable_time': 35959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214251, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.010 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4d76d85e-cc62-42d8-8b74-4adce308772e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.051 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[573328b3-2afe-4619-802b-c94be23c2a03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.053 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bb2649b-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.054 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.054 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bb2649b-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.056 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:58 np0005603500 kernel: tap5bb2649b-60: entered promiscuous mode
Jan 31 01:36:58 np0005603500 NetworkManager[55506]: <info>  [1769841418.0572] manager: (tap5bb2649b-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.059 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5bb2649b-60, col_values=(('external_ids', {'iface-id': 'f3c2d942-7b94-4ad6-ae42-58234043864e'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:36:58 np0005603500 ovn_controller[95398]: 2026-01-31T06:36:58Z|00085|binding|INFO|Releasing lport f3c2d942-7b94-4ad6-ae42-58234043864e from this chassis (sb_readonly=0)
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.060 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.062 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[242e7989-a335-4d72-a821-b841a29ff47b]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.063 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.063 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.064 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 5bb2649b-6b65-4e17-a6b3-abb539667aef disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.064 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.064 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.064 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[3d2f73ff-6db6-4337-8a77-b89fdbf683f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.065 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.065 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ca6a3bcc-0bc6-4c0d-822a-1a408d334483]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.066 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-5bb2649b-6b65-4e17-a6b3-abb539667aef
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID 5bb2649b-6b65-4e17-a6b3-abb539667aef
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:36:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:36:58.067 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'env', 'PROCESS_TAG=haproxy-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5bb2649b-6b65-4e17-a6b3-abb539667aef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.231 182938 DEBUG nova.compute.manager [req-f0d8eff8-01d6-4acf-aea4-0721d4cf2e5d req-614dda3d-882b-463b-aa65-2bf5726a01e9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received event network-vif-plugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.232 182938 DEBUG oslo_concurrency.lockutils [req-f0d8eff8-01d6-4acf-aea4-0721d4cf2e5d req-614dda3d-882b-463b-aa65-2bf5726a01e9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.232 182938 DEBUG oslo_concurrency.lockutils [req-f0d8eff8-01d6-4acf-aea4-0721d4cf2e5d req-614dda3d-882b-463b-aa65-2bf5726a01e9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.232 182938 DEBUG oslo_concurrency.lockutils [req-f0d8eff8-01d6-4acf-aea4-0721d4cf2e5d req-614dda3d-882b-463b-aa65-2bf5726a01e9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.233 182938 DEBUG nova.compute.manager [req-f0d8eff8-01d6-4acf-aea4-0721d4cf2e5d req-614dda3d-882b-463b-aa65-2bf5726a01e9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Processing event network-vif-plugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.362 182938 DEBUG nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.365 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.369 182938 INFO nova.virt.libvirt.driver [-] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Instance spawned successfully.
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.370 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.378 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:58 np0005603500 podman[214288]: 2026-01-31 06:36:58.388450652 +0000 UTC m=+0.045702584 container create 92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127)
Jan 31 01:36:58 np0005603500 systemd[1]: Started libpod-conmon-92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e.scope.
Jan 31 01:36:58 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:36:58 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ef85706fed5713802142090825c7e461c05f589506a56b1d3c2329cd6873547/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:36:58 np0005603500 podman[214288]: 2026-01-31 06:36:58.454238595 +0000 UTC m=+0.111490527 container init 92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 31 01:36:58 np0005603500 podman[214288]: 2026-01-31 06:36:58.458356184 +0000 UTC m=+0.115608116 container start 92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:36:58 np0005603500 podman[214288]: 2026-01-31 06:36:58.362516209 +0000 UTC m=+0.019768171 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:36:58 np0005603500 neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef[214308]: [NOTICE]   (214326) : New worker (214334) forked
Jan 31 01:36:58 np0005603500 neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef[214308]: [NOTICE]   (214326) : Loading success.
Jan 31 01:36:58 np0005603500 podman[214302]: 2026-01-31 06:36:58.480630952 +0000 UTC m=+0.063768000 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.759 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.883 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.884 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.885 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.885 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.886 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:36:58 np0005603500 nova_compute[182934]: 2026-01-31 06:36:58.886 182938 DEBUG nova.virt.libvirt.driver [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:36:59 np0005603500 nova_compute[182934]: 2026-01-31 06:36:59.395 182938 INFO nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Took 17.62 seconds to spawn the instance on the hypervisor.
Jan 31 01:36:59 np0005603500 nova_compute[182934]: 2026-01-31 06:36:59.396 182938 DEBUG nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:36:59 np0005603500 nova_compute[182934]: 2026-01-31 06:36:59.915 182938 INFO nova.compute.manager [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Took 23.29 seconds to build instance.
Jan 31 01:37:00 np0005603500 nova_compute[182934]: 2026-01-31 06:37:00.421 182938 DEBUG oslo_concurrency.lockutils [None req-ee6befed-e0e2-4e9f-acad-b9dc63c69c6c dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.886s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:00 np0005603500 nova_compute[182934]: 2026-01-31 06:37:00.492 182938 DEBUG nova.compute.manager [req-da15f3d4-e365-44d2-ad52-0503aecca193 req-a6101a31-597a-481b-a7d7-1ceedd0a671b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received event network-vif-plugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:37:00 np0005603500 nova_compute[182934]: 2026-01-31 06:37:00.493 182938 DEBUG oslo_concurrency.lockutils [req-da15f3d4-e365-44d2-ad52-0503aecca193 req-a6101a31-597a-481b-a7d7-1ceedd0a671b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:00 np0005603500 nova_compute[182934]: 2026-01-31 06:37:00.494 182938 DEBUG oslo_concurrency.lockutils [req-da15f3d4-e365-44d2-ad52-0503aecca193 req-a6101a31-597a-481b-a7d7-1ceedd0a671b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:00 np0005603500 nova_compute[182934]: 2026-01-31 06:37:00.494 182938 DEBUG oslo_concurrency.lockutils [req-da15f3d4-e365-44d2-ad52-0503aecca193 req-a6101a31-597a-481b-a7d7-1ceedd0a671b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:00 np0005603500 nova_compute[182934]: 2026-01-31 06:37:00.494 182938 DEBUG nova.compute.manager [req-da15f3d4-e365-44d2-ad52-0503aecca193 req-a6101a31-597a-481b-a7d7-1ceedd0a671b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] No waiting events found dispatching network-vif-plugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:37:00 np0005603500 nova_compute[182934]: 2026-01-31 06:37:00.494 182938 WARNING nova.compute.manager [req-da15f3d4-e365-44d2-ad52-0503aecca193 req-a6101a31-597a-481b-a7d7-1ceedd0a671b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received unexpected event network-vif-plugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f for instance with vm_state active and task_state None.
Jan 31 01:37:03 np0005603500 nova_compute[182934]: 2026-01-31 06:37:03.380 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:03 np0005603500 nova_compute[182934]: 2026-01-31 06:37:03.760 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:03.845 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:37:03 np0005603500 nova_compute[182934]: 2026-01-31 06:37:03.846 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:03.847 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:37:04 np0005603500 podman[214345]: 2026-01-31 06:37:04.126271345 +0000 UTC m=+0.047450349 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Jan 31 01:37:04 np0005603500 nova_compute[182934]: 2026-01-31 06:37:04.509 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:04 np0005603500 ovn_controller[95398]: 2026-01-31T06:37:04Z|00086|binding|INFO|Releasing lport f3c2d942-7b94-4ad6-ae42-58234043864e from this chassis (sb_readonly=0)
Jan 31 01:37:04 np0005603500 NetworkManager[55506]: <info>  [1769841424.5152] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 31 01:37:04 np0005603500 NetworkManager[55506]: <info>  [1769841424.5160] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 31 01:37:04 np0005603500 nova_compute[182934]: 2026-01-31 06:37:04.519 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:04 np0005603500 ovn_controller[95398]: 2026-01-31T06:37:04Z|00087|binding|INFO|Releasing lport f3c2d942-7b94-4ad6-ae42-58234043864e from this chassis (sb_readonly=0)
Jan 31 01:37:04 np0005603500 nova_compute[182934]: 2026-01-31 06:37:04.524 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:05 np0005603500 nova_compute[182934]: 2026-01-31 06:37:05.099 182938 DEBUG nova.compute.manager [req-ec4df545-2a60-4c2e-888c-d8f5815a5407 req-ea688712-646d-40c7-9c5b-0cbc123c829c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received event network-changed-6e13d68c-b169-42b0-bafe-2026dd7b7c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:37:05 np0005603500 nova_compute[182934]: 2026-01-31 06:37:05.099 182938 DEBUG nova.compute.manager [req-ec4df545-2a60-4c2e-888c-d8f5815a5407 req-ea688712-646d-40c7-9c5b-0cbc123c829c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Refreshing instance network info cache due to event network-changed-6e13d68c-b169-42b0-bafe-2026dd7b7c9f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:37:05 np0005603500 nova_compute[182934]: 2026-01-31 06:37:05.100 182938 DEBUG oslo_concurrency.lockutils [req-ec4df545-2a60-4c2e-888c-d8f5815a5407 req-ea688712-646d-40c7-9c5b-0cbc123c829c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:37:05 np0005603500 nova_compute[182934]: 2026-01-31 06:37:05.100 182938 DEBUG oslo_concurrency.lockutils [req-ec4df545-2a60-4c2e-888c-d8f5815a5407 req-ea688712-646d-40c7-9c5b-0cbc123c829c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:37:05 np0005603500 nova_compute[182934]: 2026-01-31 06:37:05.100 182938 DEBUG nova.network.neutron [req-ec4df545-2a60-4c2e-888c-d8f5815a5407 req-ea688712-646d-40c7-9c5b-0cbc123c829c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Refreshing network info cache for port 6e13d68c-b169-42b0-bafe-2026dd7b7c9f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:37:06 np0005603500 podman[214367]: 2026-01-31 06:37:06.128437559 +0000 UTC m=+0.047534631 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:37:08 np0005603500 nova_compute[182934]: 2026-01-31 06:37:08.382 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:08 np0005603500 nova_compute[182934]: 2026-01-31 06:37:08.775 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:08.848 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:37:09 np0005603500 ovn_controller[95398]: 2026-01-31T06:37:09Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7a:a2:11 10.100.0.10
Jan 31 01:37:09 np0005603500 ovn_controller[95398]: 2026-01-31T06:37:09Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7a:a2:11 10.100.0.10
Jan 31 01:37:09 np0005603500 nova_compute[182934]: 2026-01-31 06:37:09.425 182938 DEBUG nova.network.neutron [req-ec4df545-2a60-4c2e-888c-d8f5815a5407 req-ea688712-646d-40c7-9c5b-0cbc123c829c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Updated VIF entry in instance network info cache for port 6e13d68c-b169-42b0-bafe-2026dd7b7c9f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:37:09 np0005603500 nova_compute[182934]: 2026-01-31 06:37:09.426 182938 DEBUG nova.network.neutron [req-ec4df545-2a60-4c2e-888c-d8f5815a5407 req-ea688712-646d-40c7-9c5b-0cbc123c829c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Updating instance_info_cache with network_info: [{"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:37:09 np0005603500 nova_compute[182934]: 2026-01-31 06:37:09.933 182938 DEBUG oslo_concurrency.lockutils [req-ec4df545-2a60-4c2e-888c-d8f5815a5407 req-ea688712-646d-40c7-9c5b-0cbc123c829c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:37:13 np0005603500 nova_compute[182934]: 2026-01-31 06:37:13.384 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:13 np0005603500 nova_compute[182934]: 2026-01-31 06:37:13.777 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:15 np0005603500 nova_compute[182934]: 2026-01-31 06:37:15.571 182938 INFO nova.compute.manager [None req-8badbf0f-f8cd-46e6-a76e-926d3e0b6042 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Get console output
Jan 31 01:37:15 np0005603500 nova_compute[182934]: 2026-01-31 06:37:15.583 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:37:18 np0005603500 nova_compute[182934]: 2026-01-31 06:37:18.386 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:18 np0005603500 nova_compute[182934]: 2026-01-31 06:37:18.413 182938 DEBUG nova.compute.manager [req-9ac3f82f-c388-46ab-9905-05051cf73df5 req-94eaa54b-2b65-47bc-9e17-bae5653cf585 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received event network-changed-6e13d68c-b169-42b0-bafe-2026dd7b7c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:37:18 np0005603500 nova_compute[182934]: 2026-01-31 06:37:18.413 182938 DEBUG nova.compute.manager [req-9ac3f82f-c388-46ab-9905-05051cf73df5 req-94eaa54b-2b65-47bc-9e17-bae5653cf585 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Refreshing instance network info cache due to event network-changed-6e13d68c-b169-42b0-bafe-2026dd7b7c9f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:37:18 np0005603500 nova_compute[182934]: 2026-01-31 06:37:18.413 182938 DEBUG oslo_concurrency.lockutils [req-9ac3f82f-c388-46ab-9905-05051cf73df5 req-94eaa54b-2b65-47bc-9e17-bae5653cf585 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:37:18 np0005603500 nova_compute[182934]: 2026-01-31 06:37:18.414 182938 DEBUG oslo_concurrency.lockutils [req-9ac3f82f-c388-46ab-9905-05051cf73df5 req-94eaa54b-2b65-47bc-9e17-bae5653cf585 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:37:18 np0005603500 nova_compute[182934]: 2026-01-31 06:37:18.414 182938 DEBUG nova.network.neutron [req-9ac3f82f-c388-46ab-9905-05051cf73df5 req-94eaa54b-2b65-47bc-9e17-bae5653cf585 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Refreshing network info cache for port 6e13d68c-b169-42b0-bafe-2026dd7b7c9f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:37:18 np0005603500 nova_compute[182934]: 2026-01-31 06:37:18.778 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:19 np0005603500 podman[214405]: 2026-01-31 06:37:19.137518123 +0000 UTC m=+0.049127942 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 01:37:19 np0005603500 podman[214404]: 2026-01-31 06:37:19.168373071 +0000 UTC m=+0.079357490 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:37:23 np0005603500 nova_compute[182934]: 2026-01-31 06:37:23.388 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:23 np0005603500 nova_compute[182934]: 2026-01-31 06:37:23.584 182938 DEBUG nova.network.neutron [req-9ac3f82f-c388-46ab-9905-05051cf73df5 req-94eaa54b-2b65-47bc-9e17-bae5653cf585 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Updated VIF entry in instance network info cache for port 6e13d68c-b169-42b0-bafe-2026dd7b7c9f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:37:23 np0005603500 nova_compute[182934]: 2026-01-31 06:37:23.585 182938 DEBUG nova.network.neutron [req-9ac3f82f-c388-46ab-9905-05051cf73df5 req-94eaa54b-2b65-47bc-9e17-bae5653cf585 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Updating instance_info_cache with network_info: [{"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:37:23 np0005603500 nova_compute[182934]: 2026-01-31 06:37:23.780 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:24 np0005603500 nova_compute[182934]: 2026-01-31 06:37:24.129 182938 DEBUG oslo_concurrency.lockutils [req-9ac3f82f-c388-46ab-9905-05051cf73df5 req-94eaa54b-2b65-47bc-9e17-bae5653cf585 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-5a6339ba-548e-4964-9c45-15df2cf116b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:37:27 np0005603500 podman[214445]: 2026-01-31 06:37:27.148431505 +0000 UTC m=+0.053639093 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Jan 31 01:37:28 np0005603500 nova_compute[182934]: 2026-01-31 06:37:28.390 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:28 np0005603500 nova_compute[182934]: 2026-01-31 06:37:28.782 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:29 np0005603500 nova_compute[182934]: 2026-01-31 06:37:29.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:37:29 np0005603500 nova_compute[182934]: 2026-01-31 06:37:29.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:37:29 np0005603500 podman[214467]: 2026-01-31 06:37:29.197729228 +0000 UTC m=+0.111027192 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 31 01:37:29 np0005603500 nova_compute[182934]: 2026-01-31 06:37:29.677 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:29 np0005603500 nova_compute[182934]: 2026-01-31 06:37:29.678 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:29 np0005603500 nova_compute[182934]: 2026-01-31 06:37:29.678 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:29 np0005603500 nova_compute[182934]: 2026-01-31 06:37:29.678 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:37:30 np0005603500 nova_compute[182934]: 2026-01-31 06:37:30.769 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:37:30 np0005603500 nova_compute[182934]: 2026-01-31 06:37:30.812 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:37:30 np0005603500 nova_compute[182934]: 2026-01-31 06:37:30.812 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:37:30 np0005603500 nova_compute[182934]: 2026-01-31 06:37:30.874 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:37:31 np0005603500 nova_compute[182934]: 2026-01-31 06:37:31.001 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:37:31 np0005603500 nova_compute[182934]: 2026-01-31 06:37:31.002 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5622MB free_disk=73.18710708618164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:37:31 np0005603500 nova_compute[182934]: 2026-01-31 06:37:31.003 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:31 np0005603500 nova_compute[182934]: 2026-01-31 06:37:31.003 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:32 np0005603500 nova_compute[182934]: 2026-01-31 06:37:32.120 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 5a6339ba-548e-4964-9c45-15df2cf116b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:37:32 np0005603500 nova_compute[182934]: 2026-01-31 06:37:32.120 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:37:32 np0005603500 nova_compute[182934]: 2026-01-31 06:37:32.120 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:37:32 np0005603500 nova_compute[182934]: 2026-01-31 06:37:32.156 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:37:32 np0005603500 nova_compute[182934]: 2026-01-31 06:37:32.665 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:37:33 np0005603500 nova_compute[182934]: 2026-01-31 06:37:33.177 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:37:33 np0005603500 nova_compute[182934]: 2026-01-31 06:37:33.178 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:33 np0005603500 nova_compute[182934]: 2026-01-31 06:37:33.392 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:33 np0005603500 nova_compute[182934]: 2026-01-31 06:37:33.784 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.115 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.115 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:35 np0005603500 podman[214501]: 2026-01-31 06:37:35.133385773 +0000 UTC m=+0.050287777 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.179 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.180 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.632 182938 DEBUG nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.690 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.690 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.690 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.691 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:37:35 np0005603500 nova_compute[182934]: 2026-01-31 06:37:35.691 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:37:36 np0005603500 nova_compute[182934]: 2026-01-31 06:37:36.158 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:36 np0005603500 nova_compute[182934]: 2026-01-31 06:37:36.159 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:36 np0005603500 nova_compute[182934]: 2026-01-31 06:37:36.165 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:37:36 np0005603500 nova_compute[182934]: 2026-01-31 06:37:36.165 182938 INFO nova.compute.claims [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:37:37 np0005603500 podman[214521]: 2026-01-31 06:37:37.158713154 +0000 UTC m=+0.074990162 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:37:37 np0005603500 nova_compute[182934]: 2026-01-31 06:37:37.289 182938 DEBUG nova.compute.provider_tree [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:37:37 np0005603500 nova_compute[182934]: 2026-01-31 06:37:37.797 182938 DEBUG nova.scheduler.client.report [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:37:38 np0005603500 nova_compute[182934]: 2026-01-31 06:37:38.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:37:38 np0005603500 nova_compute[182934]: 2026-01-31 06:37:38.307 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:38 np0005603500 nova_compute[182934]: 2026-01-31 06:37:38.308 182938 DEBUG nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:37:38 np0005603500 nova_compute[182934]: 2026-01-31 06:37:38.396 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:38 np0005603500 nova_compute[182934]: 2026-01-31 06:37:38.786 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:38 np0005603500 nova_compute[182934]: 2026-01-31 06:37:38.836 182938 DEBUG nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:37:38 np0005603500 nova_compute[182934]: 2026-01-31 06:37:38.837 182938 DEBUG nova.network.neutron [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:37:39 np0005603500 nova_compute[182934]: 2026-01-31 06:37:39.254 182938 DEBUG nova.policy [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:37:39 np0005603500 nova_compute[182934]: 2026-01-31 06:37:39.350 182938 INFO nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:37:39 np0005603500 nova_compute[182934]: 2026-01-31 06:37:39.948 182938 DEBUG nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:37:40 np0005603500 nova_compute[182934]: 2026-01-31 06:37:40.993 182938 DEBUG nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:37:40 np0005603500 nova_compute[182934]: 2026-01-31 06:37:40.995 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:37:40 np0005603500 nova_compute[182934]: 2026-01-31 06:37:40.995 182938 INFO nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Creating image(s)
Jan 31 01:37:40 np0005603500 nova_compute[182934]: 2026-01-31 06:37:40.996 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:40 np0005603500 nova_compute[182934]: 2026-01-31 06:37:40.996 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:40 np0005603500 nova_compute[182934]: 2026-01-31 06:37:40.997 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:40 np0005603500 nova_compute[182934]: 2026-01-31 06:37:40.998 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.002 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.004 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.064 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.065 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.066 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.066 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.070 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.071 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.126 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.127 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.634 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk 1073741824" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.635 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.636 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.676 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.677 182938 DEBUG nova.virt.disk.api [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.677 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.717 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.718 182938 DEBUG nova.virt.disk.api [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.718 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.718 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Ensure instance console log exists: /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.719 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.719 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:41 np0005603500 nova_compute[182934]: 2026-01-31 06:37:41.719 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:42 np0005603500 nova_compute[182934]: 2026-01-31 06:37:42.064 182938 DEBUG nova.network.neutron [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Successfully created port: 97f9f03f-2e42-4e6f-8298-2ff786f839a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:37:43 np0005603500 nova_compute[182934]: 2026-01-31 06:37:43.256 182938 DEBUG nova.network.neutron [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Successfully updated port: 97f9f03f-2e42-4e6f-8298-2ff786f839a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:37:43 np0005603500 nova_compute[182934]: 2026-01-31 06:37:43.398 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:43 np0005603500 nova_compute[182934]: 2026-01-31 06:37:43.475 182938 DEBUG nova.compute.manager [req-e5d308c4-a480-4d4a-b06f-c4ddc7684240 req-098c851a-23b0-4da1-8faa-1a98900aec43 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received event network-changed-97f9f03f-2e42-4e6f-8298-2ff786f839a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:37:43 np0005603500 nova_compute[182934]: 2026-01-31 06:37:43.475 182938 DEBUG nova.compute.manager [req-e5d308c4-a480-4d4a-b06f-c4ddc7684240 req-098c851a-23b0-4da1-8faa-1a98900aec43 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Refreshing instance network info cache due to event network-changed-97f9f03f-2e42-4e6f-8298-2ff786f839a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:37:43 np0005603500 nova_compute[182934]: 2026-01-31 06:37:43.476 182938 DEBUG oslo_concurrency.lockutils [req-e5d308c4-a480-4d4a-b06f-c4ddc7684240 req-098c851a-23b0-4da1-8faa-1a98900aec43 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-7e5ae229-b9bd-4a48-823d-148fea52e9c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:37:43 np0005603500 nova_compute[182934]: 2026-01-31 06:37:43.476 182938 DEBUG oslo_concurrency.lockutils [req-e5d308c4-a480-4d4a-b06f-c4ddc7684240 req-098c851a-23b0-4da1-8faa-1a98900aec43 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-7e5ae229-b9bd-4a48-823d-148fea52e9c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:37:43 np0005603500 nova_compute[182934]: 2026-01-31 06:37:43.476 182938 DEBUG nova.network.neutron [req-e5d308c4-a480-4d4a-b06f-c4ddc7684240 req-098c851a-23b0-4da1-8faa-1a98900aec43 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Refreshing network info cache for port 97f9f03f-2e42-4e6f-8298-2ff786f839a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:37:43 np0005603500 nova_compute[182934]: 2026-01-31 06:37:43.762 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-7e5ae229-b9bd-4a48-823d-148fea52e9c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:37:43 np0005603500 nova_compute[182934]: 2026-01-31 06:37:43.788 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:44 np0005603500 nova_compute[182934]: 2026-01-31 06:37:44.694 182938 DEBUG nova.network.neutron [req-e5d308c4-a480-4d4a-b06f-c4ddc7684240 req-098c851a-23b0-4da1-8faa-1a98900aec43 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:37:45 np0005603500 nova_compute[182934]: 2026-01-31 06:37:45.395 182938 DEBUG nova.network.neutron [req-e5d308c4-a480-4d4a-b06f-c4ddc7684240 req-098c851a-23b0-4da1-8faa-1a98900aec43 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:37:45 np0005603500 nova_compute[182934]: 2026-01-31 06:37:45.903 182938 DEBUG oslo_concurrency.lockutils [req-e5d308c4-a480-4d4a-b06f-c4ddc7684240 req-098c851a-23b0-4da1-8faa-1a98900aec43 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-7e5ae229-b9bd-4a48-823d-148fea52e9c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:37:45 np0005603500 nova_compute[182934]: 2026-01-31 06:37:45.905 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-7e5ae229-b9bd-4a48-823d-148fea52e9c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:37:45 np0005603500 nova_compute[182934]: 2026-01-31 06:37:45.905 182938 DEBUG nova.network.neutron [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:37:47 np0005603500 nova_compute[182934]: 2026-01-31 06:37:47.699 182938 DEBUG nova.network.neutron [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:37:48 np0005603500 nova_compute[182934]: 2026-01-31 06:37:48.400 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:48 np0005603500 nova_compute[182934]: 2026-01-31 06:37:48.790 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:50 np0005603500 podman[214564]: 2026-01-31 06:37:50.147511341 +0000 UTC m=+0.055634766 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:37:50 np0005603500 podman[214565]: 2026-01-31 06:37:50.147611374 +0000 UTC m=+0.054575323 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 01:37:50 np0005603500 nova_compute[182934]: 2026-01-31 06:37:50.694 182938 DEBUG nova.network.neutron [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Updating instance_info_cache with network_info: [{"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.203 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-7e5ae229-b9bd-4a48-823d-148fea52e9c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.203 182938 DEBUG nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Instance network_info: |[{"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.206 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Start _get_guest_xml network_info=[{"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.211 182938 WARNING nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.213 182938 DEBUG nova.virt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-53419157', uuid='7e5ae229-b9bd-4a48-823d-148fea52e9c6'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_
mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841471.212899) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.227 182938 DEBUG nova.virt.libvirt.host [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.228 182938 DEBUG nova.virt.libvirt.host [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.232 182938 DEBUG nova.virt.libvirt.host [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.232 182938 DEBUG nova.virt.libvirt.host [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.233 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.233 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.233 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.234 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.234 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.234 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.235 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.235 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.235 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.236 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.236 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.236 182938 DEBUG nova.virt.hardware [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.241 182938 DEBUG nova.virt.libvirt.vif [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:37:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-53419157',display_name='tempest-TestNetworkBasicOps-server-53419157',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-53419157',id=5,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAKfLScnFXqsXUO9QdQDilfhmO2jUXno7JwAO2cYVJSjA/SOenKorDCHwJ1PH/2Dc22k4Os9Y1gx8ERaVKGHXX+bmNTih2wiYTOyMgfqtxdjDxvayWS/7JgxB1VU+XIhlg==',key_name='tempest-TestNetworkBasicOps-919862997',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-rvu2s6ok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:37:40Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=7e5ae229-b9bd-4a48-823d-148fea52e9c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.242 182938 DEBUG nova.network.os_vif_util [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.243 182938 DEBUG nova.network.os_vif_util [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:50:0a,bridge_name='br-int',has_traffic_filtering=True,id=97f9f03f-2e42-4e6f-8298-2ff786f839a4,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97f9f03f-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.244 182938 DEBUG nova.objects.instance [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e5ae229-b9bd-4a48-823d-148fea52e9c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.752 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <uuid>7e5ae229-b9bd-4a48-823d-148fea52e9c6</uuid>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <name>instance-00000005</name>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-53419157</nova:name>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:37:51</nova:creationTime>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:        <nova:port uuid="97f9f03f-2e42-4e6f-8298-2ff786f839a4">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <entry name="serial">7e5ae229-b9bd-4a48-823d-148fea52e9c6</entry>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <entry name="uuid">7e5ae229-b9bd-4a48-823d-148fea52e9c6</entry>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk.config"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:b6:50:0a"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <target dev="tap97f9f03f-2e"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/console.log" append="off"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:37:51 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:37:51 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:37:51 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:37:51 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.753 182938 DEBUG nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Preparing to wait for external event network-vif-plugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.753 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.754 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.754 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.754 182938 DEBUG nova.virt.libvirt.vif [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:37:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-53419157',display_name='tempest-TestNetworkBasicOps-server-53419157',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-53419157',id=5,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAKfLScnFXqsXUO9QdQDilfhmO2jUXno7JwAO2cYVJSjA/SOenKorDCHwJ1PH/2Dc22k4Os9Y1gx8ERaVKGHXX+bmNTih2wiYTOyMgfqtxdjDxvayWS/7JgxB1VU+XIhlg==',key_name='tempest-TestNetworkBasicOps-919862997',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-rvu2s6ok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:37:40Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=7e5ae229-b9bd-4a48-823d-148fea52e9c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.755 182938 DEBUG nova.network.os_vif_util [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.755 182938 DEBUG nova.network.os_vif_util [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:50:0a,bridge_name='br-int',has_traffic_filtering=True,id=97f9f03f-2e42-4e6f-8298-2ff786f839a4,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97f9f03f-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.755 182938 DEBUG os_vif [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:50:0a,bridge_name='br-int',has_traffic_filtering=True,id=97f9f03f-2e42-4e6f-8298-2ff786f839a4,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97f9f03f-2e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.756 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.756 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.757 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.757 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.758 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0aef102e-dc92-587c-b5b6-2ce1c155321b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.759 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.760 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.763 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.763 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap97f9f03f-2e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.763 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap97f9f03f-2e, col_values=(('qos', UUID('2cc7adfb-051f-486c-a47c-b7a2fae710b2')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.763 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap97f9f03f-2e, col_values=(('external_ids', {'iface-id': '97f9f03f-2e42-4e6f-8298-2ff786f839a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:50:0a', 'vm-uuid': '7e5ae229-b9bd-4a48-823d-148fea52e9c6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.764 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:51 np0005603500 NetworkManager[55506]: <info>  [1769841471.7664] manager: (tap97f9f03f-2e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.767 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.770 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:51 np0005603500 nova_compute[182934]: 2026-01-31 06:37:51.770 182938 INFO os_vif [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:50:0a,bridge_name='br-int',has_traffic_filtering=True,id=97f9f03f-2e42-4e6f-8298-2ff786f839a4,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97f9f03f-2e')
Jan 31 01:37:53 np0005603500 nova_compute[182934]: 2026-01-31 06:37:53.394 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:37:53 np0005603500 nova_compute[182934]: 2026-01-31 06:37:53.394 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:37:53 np0005603500 nova_compute[182934]: 2026-01-31 06:37:53.394 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:b6:50:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:37:53 np0005603500 nova_compute[182934]: 2026-01-31 06:37:53.395 182938 INFO nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Using config drive
Jan 31 01:37:53 np0005603500 nova_compute[182934]: 2026-01-31 06:37:53.402 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:54 np0005603500 nova_compute[182934]: 2026-01-31 06:37:54.900 182938 INFO nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Creating config drive at /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk.config
Jan 31 01:37:54 np0005603500 nova_compute[182934]: 2026-01-31 06:37:54.904 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpafl9o977 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.031 182938 DEBUG oslo_concurrency.processutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpafl9o977" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:37:55 np0005603500 kernel: tap97f9f03f-2e: entered promiscuous mode
Jan 31 01:37:55 np0005603500 NetworkManager[55506]: <info>  [1769841475.0697] manager: (tap97f9f03f-2e): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 31 01:37:55 np0005603500 ovn_controller[95398]: 2026-01-31T06:37:55Z|00088|binding|INFO|Claiming lport 97f9f03f-2e42-4e6f-8298-2ff786f839a4 for this chassis.
Jan 31 01:37:55 np0005603500 ovn_controller[95398]: 2026-01-31T06:37:55Z|00089|binding|INFO|97f9f03f-2e42-4e6f-8298-2ff786f839a4: Claiming fa:16:3e:b6:50:0a 10.100.0.12
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.071 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:55 np0005603500 ovn_controller[95398]: 2026-01-31T06:37:55Z|00090|binding|INFO|Setting lport 97f9f03f-2e42-4e6f-8298-2ff786f839a4 ovn-installed in OVS
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.077 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.078 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:55 np0005603500 ovn_controller[95398]: 2026-01-31T06:37:55Z|00091|binding|INFO|Setting lport 97f9f03f-2e42-4e6f-8298-2ff786f839a4 up in Southbound
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.083 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:50:0a 10.100.0.12'], port_security=['fa:16:3e:b6:50:0a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7e5ae229-b9bd-4a48-823d-148fea52e9c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2eeeb684-4192-478e-8ce8-fc5a96ad0662', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29d6cd86-2fe1-46f4-ab6e-26f3373754c7, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=97f9f03f-2e42-4e6f-8298-2ff786f839a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.086 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 97f9f03f-2e42-4e6f-8298-2ff786f839a4 in datapath 5bb2649b-6b65-4e17-a6b3-abb539667aef bound to our chassis
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.088 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5bb2649b-6b65-4e17-a6b3-abb539667aef
Jan 31 01:37:55 np0005603500 systemd-udevd[214622]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.103 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c764c9e9-ec03-448e-b110-64e2c04a6dfd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:37:55 np0005603500 systemd-machined[154375]: New machine qemu-5-instance-00000005.
Jan 31 01:37:55 np0005603500 NetworkManager[55506]: <info>  [1769841475.1093] device (tap97f9f03f-2e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:37:55 np0005603500 NetworkManager[55506]: <info>  [1769841475.1100] device (tap97f9f03f-2e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:37:55 np0005603500 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.130 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[df90826c-2c1f-4518-a9ef-6b324934ae0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.134 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9c8cd2-dfbe-45e8-85eb-b3d2cba53ded]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.159 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[9c48565e-12ab-4dba-943b-e56c6ff2c998]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.178 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[bd65017e-1e43-4262-a546-8d523941e986]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bb2649b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:48:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372924, 'reachable_time': 35959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214635, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.204 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[dc4fe65e-2d30-48f9-b9e4-de89881233c3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5bb2649b-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372932, 'tstamp': 372932}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214636, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5bb2649b-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372934, 'tstamp': 372934}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214636, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.206 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bb2649b-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.208 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.209 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.210 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bb2649b-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.210 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.210 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5bb2649b-60, col_values=(('external_ids', {'iface-id': 'f3c2d942-7b94-4ad6-ae42-58234043864e'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.211 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.212 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb87533-4690-4665-af26-154c1a958f77]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5bb2649b-6b65-4e17-a6b3-abb539667aef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5bb2649b-6b65-4e17-a6b3-abb539667aef\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.414 182938 DEBUG nova.compute.manager [req-bcbdfc56-1758-429e-996a-393f630afd69 req-31e634a7-30fa-4406-8c8b-134b8e57d19b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received event network-vif-plugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.415 182938 DEBUG oslo_concurrency.lockutils [req-bcbdfc56-1758-429e-996a-393f630afd69 req-31e634a7-30fa-4406-8c8b-134b8e57d19b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.415 182938 DEBUG oslo_concurrency.lockutils [req-bcbdfc56-1758-429e-996a-393f630afd69 req-31e634a7-30fa-4406-8c8b-134b8e57d19b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.416 182938 DEBUG oslo_concurrency.lockutils [req-bcbdfc56-1758-429e-996a-393f630afd69 req-31e634a7-30fa-4406-8c8b-134b8e57d19b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:55 np0005603500 nova_compute[182934]: 2026-01-31 06:37:55.416 182938 DEBUG nova.compute.manager [req-bcbdfc56-1758-429e-996a-393f630afd69 req-31e634a7-30fa-4406-8c8b-134b8e57d19b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Processing event network-vif-plugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.838 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.839 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:37:55.840 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.310 182938 DEBUG nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.314 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.317 182938 INFO nova.virt.libvirt.driver [-] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Instance spawned successfully.
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.317 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.766 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.834 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.835 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.835 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.836 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.836 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:37:56 np0005603500 nova_compute[182934]: 2026-01-31 06:37:56.836 182938 DEBUG nova.virt.libvirt.driver [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:37:57 np0005603500 nova_compute[182934]: 2026-01-31 06:37:57.346 182938 INFO nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Took 16.35 seconds to spawn the instance on the hypervisor.
Jan 31 01:37:57 np0005603500 nova_compute[182934]: 2026-01-31 06:37:57.346 182938 DEBUG nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:37:57 np0005603500 nova_compute[182934]: 2026-01-31 06:37:57.619 182938 DEBUG nova.compute.manager [req-c3a7968f-44d5-4d3e-82aa-1ff98443c359 req-d37f7e86-1637-4b04-b2e1-63d8de616ad6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received event network-vif-plugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:37:57 np0005603500 nova_compute[182934]: 2026-01-31 06:37:57.620 182938 DEBUG oslo_concurrency.lockutils [req-c3a7968f-44d5-4d3e-82aa-1ff98443c359 req-d37f7e86-1637-4b04-b2e1-63d8de616ad6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:37:57 np0005603500 nova_compute[182934]: 2026-01-31 06:37:57.620 182938 DEBUG oslo_concurrency.lockutils [req-c3a7968f-44d5-4d3e-82aa-1ff98443c359 req-d37f7e86-1637-4b04-b2e1-63d8de616ad6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:37:57 np0005603500 nova_compute[182934]: 2026-01-31 06:37:57.620 182938 DEBUG oslo_concurrency.lockutils [req-c3a7968f-44d5-4d3e-82aa-1ff98443c359 req-d37f7e86-1637-4b04-b2e1-63d8de616ad6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:57 np0005603500 nova_compute[182934]: 2026-01-31 06:37:57.621 182938 DEBUG nova.compute.manager [req-c3a7968f-44d5-4d3e-82aa-1ff98443c359 req-d37f7e86-1637-4b04-b2e1-63d8de616ad6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] No waiting events found dispatching network-vif-plugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:37:57 np0005603500 nova_compute[182934]: 2026-01-31 06:37:57.621 182938 WARNING nova.compute.manager [req-c3a7968f-44d5-4d3e-82aa-1ff98443c359 req-d37f7e86-1637-4b04-b2e1-63d8de616ad6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received unexpected event network-vif-plugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 for instance with vm_state active and task_state None.
Jan 31 01:37:57 np0005603500 nova_compute[182934]: 2026-01-31 06:37:57.867 182938 INFO nova.compute.manager [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Took 21.72 seconds to build instance.
Jan 31 01:37:58 np0005603500 podman[214646]: 2026-01-31 06:37:58.176734497 +0000 UTC m=+0.087538036 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public)
Jan 31 01:37:58 np0005603500 nova_compute[182934]: 2026-01-31 06:37:58.373 182938 DEBUG oslo_concurrency.lockutils [None req-b1ce76b0-5742-4c94-9999-54104d926076 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:37:58 np0005603500 nova_compute[182934]: 2026-01-31 06:37:58.404 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:00 np0005603500 podman[214667]: 2026-01-31 06:38:00.154555978 +0000 UTC m=+0.077677456 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:38:01 np0005603500 nova_compute[182934]: 2026-01-31 06:38:01.770 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:02 np0005603500 nova_compute[182934]: 2026-01-31 06:38:02.998 182938 DEBUG nova.compute.manager [req-d7de8334-981b-4b03-b776-da90f0ed176e req-25306508-c74d-48c2-a9b5-1c4e30a8989b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received event network-changed-97f9f03f-2e42-4e6f-8298-2ff786f839a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:38:03 np0005603500 nova_compute[182934]: 2026-01-31 06:38:02.999 182938 DEBUG nova.compute.manager [req-d7de8334-981b-4b03-b776-da90f0ed176e req-25306508-c74d-48c2-a9b5-1c4e30a8989b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Refreshing instance network info cache due to event network-changed-97f9f03f-2e42-4e6f-8298-2ff786f839a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:38:03 np0005603500 nova_compute[182934]: 2026-01-31 06:38:02.999 182938 DEBUG oslo_concurrency.lockutils [req-d7de8334-981b-4b03-b776-da90f0ed176e req-25306508-c74d-48c2-a9b5-1c4e30a8989b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-7e5ae229-b9bd-4a48-823d-148fea52e9c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:38:03 np0005603500 nova_compute[182934]: 2026-01-31 06:38:02.999 182938 DEBUG oslo_concurrency.lockutils [req-d7de8334-981b-4b03-b776-da90f0ed176e req-25306508-c74d-48c2-a9b5-1c4e30a8989b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-7e5ae229-b9bd-4a48-823d-148fea52e9c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:38:03 np0005603500 nova_compute[182934]: 2026-01-31 06:38:02.999 182938 DEBUG nova.network.neutron [req-d7de8334-981b-4b03-b776-da90f0ed176e req-25306508-c74d-48c2-a9b5-1c4e30a8989b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Refreshing network info cache for port 97f9f03f-2e42-4e6f-8298-2ff786f839a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:38:03 np0005603500 nova_compute[182934]: 2026-01-31 06:38:03.408 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:06 np0005603500 podman[214695]: 2026-01-31 06:38:06.143940818 +0000 UTC m=+0.061936343 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 01:38:06 np0005603500 nova_compute[182934]: 2026-01-31 06:38:06.773 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:07 np0005603500 nova_compute[182934]: 2026-01-31 06:38:07.717 182938 DEBUG nova.network.neutron [req-d7de8334-981b-4b03-b776-da90f0ed176e req-25306508-c74d-48c2-a9b5-1c4e30a8989b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Updated VIF entry in instance network info cache for port 97f9f03f-2e42-4e6f-8298-2ff786f839a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:38:07 np0005603500 nova_compute[182934]: 2026-01-31 06:38:07.718 182938 DEBUG nova.network.neutron [req-d7de8334-981b-4b03-b776-da90f0ed176e req-25306508-c74d-48c2-a9b5-1c4e30a8989b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Updating instance_info_cache with network_info: [{"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:38:08 np0005603500 nova_compute[182934]: 2026-01-31 06:38:08.230 182938 DEBUG oslo_concurrency.lockutils [req-d7de8334-981b-4b03-b776-da90f0ed176e req-25306508-c74d-48c2-a9b5-1c4e30a8989b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-7e5ae229-b9bd-4a48-823d-148fea52e9c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:38:08 np0005603500 podman[214717]: 2026-01-31 06:38:08.281144518 +0000 UTC m=+0.083007764 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:38:08 np0005603500 nova_compute[182934]: 2026-01-31 06:38:08.410 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:11 np0005603500 ovn_controller[95398]: 2026-01-31T06:38:11Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b6:50:0a 10.100.0.12
Jan 31 01:38:11 np0005603500 ovn_controller[95398]: 2026-01-31T06:38:11Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b6:50:0a 10.100.0.12
Jan 31 01:38:11 np0005603500 nova_compute[182934]: 2026-01-31 06:38:11.776 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:13 np0005603500 nova_compute[182934]: 2026-01-31 06:38:13.412 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:16 np0005603500 nova_compute[182934]: 2026-01-31 06:38:16.779 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:17 np0005603500 nova_compute[182934]: 2026-01-31 06:38:17.560 182938 INFO nova.compute.manager [None req-9ba64386-9b95-4f50-aad7-df0bda1fc94b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Get console output
Jan 31 01:38:17 np0005603500 nova_compute[182934]: 2026-01-31 06:38:17.567 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:38:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:17.982 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:17.986 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/7e5ae229-b9bd-4a48-823d-148fea52e9c6 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}9de33c3c4c813c7413c734743528a34030291a616c281269e5092e293b0fad44" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Jan 31 01:38:18 np0005603500 nova_compute[182934]: 2026-01-31 06:38:18.414 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:18 np0005603500 nova_compute[182934]: 2026-01-31 06:38:18.968 182938 DEBUG oslo_concurrency.lockutils [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:18 np0005603500 nova_compute[182934]: 2026-01-31 06:38:18.970 182938 DEBUG oslo_concurrency.lockutils [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:18 np0005603500 nova_compute[182934]: 2026-01-31 06:38:18.970 182938 DEBUG oslo_concurrency.lockutils [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:18 np0005603500 nova_compute[182934]: 2026-01-31 06:38:18.970 182938 DEBUG oslo_concurrency.lockutils [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:18 np0005603500 nova_compute[182934]: 2026-01-31 06:38:18.971 182938 DEBUG oslo_concurrency.lockutils [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:18 np0005603500 nova_compute[182934]: 2026-01-31 06:38:18.973 182938 INFO nova.compute.manager [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Terminating instance
Jan 31 01:38:19 np0005603500 nova_compute[182934]: 2026-01-31 06:38:19.483 182938 DEBUG nova.compute.manager [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:38:19 np0005603500 kernel: tap97f9f03f-2e (unregistering): left promiscuous mode
Jan 31 01:38:19 np0005603500 NetworkManager[55506]: <info>  [1769841499.5370] device (tap97f9f03f-2e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:38:19 np0005603500 nova_compute[182934]: 2026-01-31 06:38:19.542 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:19 np0005603500 ovn_controller[95398]: 2026-01-31T06:38:19Z|00092|binding|INFO|Releasing lport 97f9f03f-2e42-4e6f-8298-2ff786f839a4 from this chassis (sb_readonly=0)
Jan 31 01:38:19 np0005603500 ovn_controller[95398]: 2026-01-31T06:38:19Z|00093|binding|INFO|Setting lport 97f9f03f-2e42-4e6f-8298-2ff786f839a4 down in Southbound
Jan 31 01:38:19 np0005603500 ovn_controller[95398]: 2026-01-31T06:38:19Z|00094|binding|INFO|Removing iface tap97f9f03f-2e ovn-installed in OVS
Jan 31 01:38:19 np0005603500 nova_compute[182934]: 2026-01-31 06:38:19.548 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.550 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:50:0a 10.100.0.12'], port_security=['fa:16:3e:b6:50:0a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7e5ae229-b9bd-4a48-823d-148fea52e9c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2eeeb684-4192-478e-8ce8-fc5a96ad0662', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29d6cd86-2fe1-46f4-ab6e-26f3373754c7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=97f9f03f-2e42-4e6f-8298-2ff786f839a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.551 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 97f9f03f-2e42-4e6f-8298-2ff786f839a4 in datapath 5bb2649b-6b65-4e17-a6b3-abb539667aef unbound from our chassis
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.553 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5bb2649b-6b65-4e17-a6b3-abb539667aef
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.564 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[768444b2-5658-4df1-a37d-9f3b949b0b1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.581 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[559a2e69-7a4e-469c-9003-5fe6b17724ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:19 np0005603500 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 31 01:38:19 np0005603500 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 14.983s CPU time.
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.584 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[73c8e8e2-675d-4c30-9b94-d45089e555bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:19 np0005603500 systemd-machined[154375]: Machine qemu-5-instance-00000005 terminated.
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.600 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[3922f865-fa00-4306-94d0-1c7a31bcfc6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:19.603 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1974 Content-Type: application/json Date: Sat, 31 Jan 2026 06:38:18 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-e0b8d5e4-76b9-4783-b50b-1a46d2ee8244 x-openstack-request-id: req-e0b8d5e4-76b9-4783-b50b-1a46d2ee8244 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Jan 31 01:38:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:19.603 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "7e5ae229-b9bd-4a48-823d-148fea52e9c6", "name": "tempest-TestNetworkBasicOps-server-53419157", "status": "ACTIVE", "tenant_id": "829310cd8381494e96216dba067ff8d3", "user_id": "dddc34b0385a49a5bd9bf081ed29e9fd", "metadata": {}, "hostId": "0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e", "image": {"id": "9f613975-b701-42a0-9b35-7d5c4a2cb7f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/9f613975-b701-42a0-9b35-7d5c4a2cb7f2"}]}, "flavor": {"id": "9956992e-a3ca-497f-9747-3ae270e07def", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9956992e-a3ca-497f-9747-3ae270e07def"}]}, "created": "2026-01-31T06:37:33Z", "updated": "2026-01-31T06:37:57Z", "addresses": {"tempest-network-smoke--1051141805": [{"version": 4, "addr": "10.100.0.12", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:b6:50:0a"}, {"version": 4, "addr": "192.168.122.201", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:b6:50:0a"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/7e5ae229-b9bd-4a48-823d-148fea52e9c6"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/7e5ae229-b9bd-4a48-823d-148fea52e9c6"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-919862997", "OS-SRV-USG:launched_at": "2026-01-31T06:37:57.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-2119119205"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000005", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, 
"OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Jan 31 01:38:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:19.604 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/7e5ae229-b9bd-4a48-823d-148fea52e9c6 used request id req-e0b8d5e4-76b9-4783-b50b-1a46d2ee8244 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.611 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c57a13d6-722e-40a1-a282-400c1d0d7683]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bb2649b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:48:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372924, 'reachable_time': 35959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214782, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.619 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ec3b69-b3a1-4864-aad3-378c5e74ec49]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5bb2649b-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372932, 'tstamp': 372932}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214783, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5bb2649b-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372934, 'tstamp': 372934}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214783, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.620 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bb2649b-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:38:19 np0005603500 nova_compute[182934]: 2026-01-31 06:38:19.621 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:19 np0005603500 nova_compute[182934]: 2026-01-31 06:38:19.624 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.625 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bb2649b-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.625 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.625 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5bb2649b-60, col_values=(('external_ids', {'iface-id': 'f3c2d942-7b94-4ad6-ae42-58234043864e'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.625 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.626 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf569cf-c2c3-437c-907a-03b1bfffe0e5]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5bb2649b-6b65-4e17-a6b3-abb539667aef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5bb2649b-6b65-4e17-a6b3-abb539667aef\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:19.689 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7e5ae229-b9bd-4a48-823d-148fea52e9c6', 'name': 'tempest-TestNetworkBasicOps-server-53419157', 'flavor': {'id': '9956992e-a3ca-497f-9747-3ae270e07def', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000005', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '829310cd8381494e96216dba067ff8d3', 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'hostId': '0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Jan 31 01:38:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:19.695 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/5a6339ba-548e-4964-9c45-15df2cf116b5 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}9de33c3c4c813c7413c734743528a34030291a616c281269e5092e293b0fad44" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Jan 31 01:38:19 np0005603500 nova_compute[182934]: 2026-01-31 06:38:19.727 182938 INFO nova.virt.libvirt.driver [-] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Instance destroyed successfully.
Jan 31 01:38:19 np0005603500 nova_compute[182934]: 2026-01-31 06:38:19.727 182938 DEBUG nova.objects.instance [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 7e5ae229-b9bd-4a48-823d-148fea52e9c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.864 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:38:19 np0005603500 nova_compute[182934]: 2026-01-31 06:38:19.865 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.865 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:38:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:19.866 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.246 182938 DEBUG nova.virt.libvirt.vif [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:37:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-53419157',display_name='tempest-TestNetworkBasicOps-server-53419157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-53419157',id=5,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAKfLScnFXqsXUO9QdQDilfhmO2jUXno7JwAO2cYVJSjA/SOenKorDCHwJ1PH/2Dc22k4Os9Y1gx8ERaVKGHXX+bmNTih2wiYTOyMgfqtxdjDxvayWS/7JgxB1VU+XIhlg==',key_name='tempest-TestNetworkBasicOps-919862997',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:37:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-rvu2s6ok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:37:57Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=7e5ae229-b9bd-4a48-823d-148fea52e9c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.246 182938 DEBUG nova.network.os_vif_util [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "address": "fa:16:3e:b6:50:0a", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97f9f03f-2e", "ovs_interfaceid": "97f9f03f-2e42-4e6f-8298-2ff786f839a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.247 182938 DEBUG nova.network.os_vif_util [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b6:50:0a,bridge_name='br-int',has_traffic_filtering=True,id=97f9f03f-2e42-4e6f-8298-2ff786f839a4,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97f9f03f-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.247 182938 DEBUG os_vif [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:50:0a,bridge_name='br-int',has_traffic_filtering=True,id=97f9f03f-2e42-4e6f-8298-2ff786f839a4,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97f9f03f-2e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.249 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.250 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97f9f03f-2e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.251 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.253 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.254 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.254 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2cc7adfb-051f-486c-a47c-b7a2fae710b2) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.255 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.256 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.258 182938 INFO os_vif [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:50:0a,bridge_name='br-int',has_traffic_filtering=True,id=97f9f03f-2e42-4e6f-8298-2ff786f839a4,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97f9f03f-2e')
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.259 182938 INFO nova.virt.libvirt.driver [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Deleting instance files /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6_del
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.260 182938 INFO nova.virt.libvirt.driver [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Deletion of /var/lib/nova/instances/7e5ae229-b9bd-4a48-823d-148fea52e9c6_del complete
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.447 182938 DEBUG nova.compute.manager [req-95897e5c-24f6-4d84-bfe5-f8ebdbcb63c7 req-2d1af944-6086-4a78-a432-b0eb0b582c58 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received event network-vif-unplugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.448 182938 DEBUG oslo_concurrency.lockutils [req-95897e5c-24f6-4d84-bfe5-f8ebdbcb63c7 req-2d1af944-6086-4a78-a432-b0eb0b582c58 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.448 182938 DEBUG oslo_concurrency.lockutils [req-95897e5c-24f6-4d84-bfe5-f8ebdbcb63c7 req-2d1af944-6086-4a78-a432-b0eb0b582c58 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.448 182938 DEBUG oslo_concurrency.lockutils [req-95897e5c-24f6-4d84-bfe5-f8ebdbcb63c7 req-2d1af944-6086-4a78-a432-b0eb0b582c58 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.449 182938 DEBUG nova.compute.manager [req-95897e5c-24f6-4d84-bfe5-f8ebdbcb63c7 req-2d1af944-6086-4a78-a432-b0eb0b582c58 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] No waiting events found dispatching network-vif-unplugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.449 182938 DEBUG nova.compute.manager [req-95897e5c-24f6-4d84-bfe5-f8ebdbcb63c7 req-2d1af944-6086-4a78-a432-b0eb0b582c58 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received event network-vif-unplugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.778 182938 INFO nova.compute.manager [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Took 1.29 seconds to destroy the instance on the hypervisor.
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.778 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.779 182938 DEBUG nova.compute.manager [-] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:38:20 np0005603500 nova_compute[182934]: 2026-01-31 06:38:20.779 182938 DEBUG nova.network.neutron [-] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:38:21 np0005603500 podman[214804]: 2026-01-31 06:38:21.132709102 +0000 UTC m=+0.049079319 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 01:38:21 np0005603500 podman[214803]: 2026-01-31 06:38:21.132872467 +0000 UTC m=+0.049924426 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.693 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1854 Content-Type: application/json Date: Sat, 31 Jan 2026 06:38:19 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-b78fe8bf-f124-4846-92de-a282138894de x-openstack-request-id: req-b78fe8bf-f124-4846-92de-a282138894de _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.694 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "5a6339ba-548e-4964-9c45-15df2cf116b5", "name": "tempest-TestNetworkBasicOps-server-1600725576", "status": "ACTIVE", "tenant_id": "829310cd8381494e96216dba067ff8d3", "user_id": "dddc34b0385a49a5bd9bf081ed29e9fd", "metadata": {}, "hostId": "0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e", "image": {"id": "9f613975-b701-42a0-9b35-7d5c4a2cb7f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/9f613975-b701-42a0-9b35-7d5c4a2cb7f2"}]}, "flavor": {"id": "9956992e-a3ca-497f-9747-3ae270e07def", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9956992e-a3ca-497f-9747-3ae270e07def"}]}, "created": "2026-01-31T06:36:32Z", "updated": "2026-01-31T06:36:59Z", "addresses": {"tempest-network-smoke--1051141805": [{"version": 4, "addr": "10.100.0.10", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:7a:a2:11"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/5a6339ba-548e-4964-9c45-15df2cf116b5"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/5a6339ba-548e-4964-9c45-15df2cf116b5"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-372120458", "OS-SRV-USG:launched_at": "2026-01-31T06:36:59.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-1998561109"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000004", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.694 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/5a6339ba-548e-4964-9c45-15df2cf116b5 used request id req-b78fe8bf-f124-4846-92de-a282138894de request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.695 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5a6339ba-548e-4964-9c45-15df2cf116b5', 'name': 'tempest-TestNetworkBasicOps-server-1600725576', 'flavor': {'id': '9956992e-a3ca-497f-9747-3ae270e07def', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '829310cd8381494e96216dba067ff8d3', 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'hostId': '0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.696 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.696 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.696 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.696 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.697 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2026-01-31T06:38:21.696410) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.697 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.700 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5a6339ba-548e-4964-9c45-15df2cf116b5 / tap6e13d68c-b1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.700 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.incoming.packets volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.701 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.701 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.701 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.701 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.701 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.701 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.702 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2026-01-31T06:38:21.701951) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.702 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.731 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.write.bytes volume: 73125888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.732 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.732 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.733 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.733 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.733 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.733 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.734 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2026-01-31T06:38:21.733825) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.733 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.735 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.735 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.read.latency volume: 593718714 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.736 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.read.latency volume: 58124636 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.736 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.736 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.737 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.737 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.737 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.737 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.738 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2026-01-31T06:38:21.737454) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.738 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.739 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.739 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.740 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.740 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.740 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.740 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.740 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.741 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2026-01-31T06:38:21.740926) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.742 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.759 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.760 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.761 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.761 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.761 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.762 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.762 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.762 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2026-01-31T06:38:21.762254) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.764 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.764 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.write.requests volume: 338 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.765 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.765 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.766 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.766 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.766 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.767 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.767 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.767 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2026-01-31T06:38:21.767374) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.768 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.769 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.770 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.770 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.770 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.771 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.771 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.771 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2026-01-31T06:38:21.771626) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.771 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.773 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.773 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.write.latency volume: 2572851580 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.774 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.774 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.774 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.774 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.774 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.775 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.775 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2026-01-31T06:38:21.775213) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.775 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.776 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.776 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.776 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.776 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.776 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.776 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.776 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2026-01-31T06:38:21.776756) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.777 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.777 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/memory.usage volume: 46.97265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.777 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.778 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.778 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.778 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.778 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.778 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.778 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2026-01-31T06:38:21.778412) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.779 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.779 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.779 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.779 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.779 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.780 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.780 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.780 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.780 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2026-01-31T06:38:21.780232) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.781 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.791 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.791 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.792 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.792 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.792 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.792 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.792 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.792 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.793 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2026-01-31T06:38:21.792750) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.793 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.794 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.794 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.794 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.794 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.794 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.794 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.795 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.795 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2026-01-31T06:38:21.795000) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.795 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.795 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.outgoing.packets volume: 111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.796 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.796 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.796 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.796 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.796 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.796 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2026-01-31T06:38:21.796777) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.796 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.797 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.797 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.incoming.bytes volume: 19926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.797 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.798 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.798 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.798 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.798 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.798 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.798 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2026-01-31T06:38:21.798698) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.799 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.799 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.799 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.799 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.799 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.800 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.800 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2026-01-31T06:38:21.800041) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.800 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.800 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.801 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.801 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.801 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.801 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.801 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.801 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.802 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.802 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2026-01-31T06:38:21.802070) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.802 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.802 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.803 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.803 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.803 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.803 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.803 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.803 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.804 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2026-01-31T06:38:21.803854) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.804 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.804 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.804 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.805 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.805 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.805 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.805 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.805 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.805 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.806 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2026-01-31T06:38:21.805954) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.806 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.806 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-53419157>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1600725576>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-53419157>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1600725576>]
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.806 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.806 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.806 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.807 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.807 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.807 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2026-01-31T06:38:21.807158) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.807 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.807 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.read.bytes volume: 30575104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.808 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.808 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.808 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.808 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.808 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.809 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.809 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.809 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2026-01-31T06:38:21.809172) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.809 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.810 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/cpu volume: 10880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.810 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.810 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.810 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.810 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.810 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.811 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2026-01-31T06:38:21.810947) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.810 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.811 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.811 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-53419157>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1600725576>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-53419157>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1600725576>]
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.811 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.811 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.811 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.812 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.812 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.812 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2026-01-31T06:38:21.812115) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.812 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.812 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.read.requests volume: 1109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.813 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.813 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.813 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.813 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.813 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.814 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.814 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.814 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2026-01-31T06:38:21.814155) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.814 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.815 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.outgoing.bytes volume: 16166 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.815 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.815 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.815 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.815 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.815 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.816 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.816 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2026-01-31T06:38:21.816060) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6'
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.816 16 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-00000005, id=7e5ae229-b9bd-4a48-823d-148fea52e9c6>: [Error Code 42] Domain not found: no domain with matching uuid '7e5ae229-b9bd-4a48-823d-148fea52e9c6' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.816 16 DEBUG ceilometer.compute.pollsters [-] 5a6339ba-548e-4964-9c45-15df2cf116b5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:38:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:38:21.817 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.734 182938 DEBUG nova.compute.manager [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received event network-vif-plugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.735 182938 DEBUG oslo_concurrency.lockutils [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.735 182938 DEBUG oslo_concurrency.lockutils [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.735 182938 DEBUG oslo_concurrency.lockutils [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.735 182938 DEBUG nova.compute.manager [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] No waiting events found dispatching network-vif-plugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.736 182938 WARNING nova.compute.manager [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received unexpected event network-vif-plugged-97f9f03f-2e42-4e6f-8298-2ff786f839a4 for instance with vm_state active and task_state deleting.
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.736 182938 DEBUG nova.compute.manager [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Received event network-vif-deleted-97f9f03f-2e42-4e6f-8298-2ff786f839a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.736 182938 INFO nova.compute.manager [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Neutron deleted interface 97f9f03f-2e42-4e6f-8298-2ff786f839a4; detaching it from the instance and deleting it from the info cache
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.736 182938 DEBUG nova.network.neutron [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:38:22 np0005603500 nova_compute[182934]: 2026-01-31 06:38:22.849 182938 DEBUG nova.network.neutron [-] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:38:23 np0005603500 nova_compute[182934]: 2026-01-31 06:38:23.244 182938 DEBUG nova.compute.manager [req-1105892d-8c0c-4568-b106-838585e69cac req-a864169e-fdfb-4742-abee-c1848f824220 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Detach interface failed, port_id=97f9f03f-2e42-4e6f-8298-2ff786f839a4, reason: Instance 7e5ae229-b9bd-4a48-823d-148fea52e9c6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Jan 31 01:38:23 np0005603500 nova_compute[182934]: 2026-01-31 06:38:23.355 182938 INFO nova.compute.manager [-] [instance: 7e5ae229-b9bd-4a48-823d-148fea52e9c6] Took 2.58 seconds to deallocate network for instance.
Jan 31 01:38:23 np0005603500 nova_compute[182934]: 2026-01-31 06:38:23.416 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:23 np0005603500 nova_compute[182934]: 2026-01-31 06:38:23.866 182938 DEBUG oslo_concurrency.lockutils [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:23 np0005603500 nova_compute[182934]: 2026-01-31 06:38:23.866 182938 DEBUG oslo_concurrency.lockutils [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:23 np0005603500 nova_compute[182934]: 2026-01-31 06:38:23.951 182938 DEBUG nova.compute.provider_tree [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:38:24 np0005603500 nova_compute[182934]: 2026-01-31 06:38:24.466 182938 DEBUG nova.scheduler.client.report [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:38:24 np0005603500 nova_compute[182934]: 2026-01-31 06:38:24.978 182938 DEBUG oslo_concurrency.lockutils [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:25 np0005603500 nova_compute[182934]: 2026-01-31 06:38:25.012 182938 INFO nova.scheduler.client.report [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 7e5ae229-b9bd-4a48-823d-148fea52e9c6
Jan 31 01:38:25 np0005603500 nova_compute[182934]: 2026-01-31 06:38:25.255 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:26 np0005603500 nova_compute[182934]: 2026-01-31 06:38:26.037 182938 DEBUG oslo_concurrency.lockutils [None req-527fda4c-5733-4e70-97bf-a018670bc992 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "7e5ae229-b9bd-4a48-823d-148fea52e9c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:28 np0005603500 nova_compute[182934]: 2026-01-31 06:38:28.418 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:29 np0005603500 podman[214844]: 2026-01-31 06:38:29.163657901 +0000 UTC m=+0.081895448 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, architecture=x86_64, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.250 182938 DEBUG oslo_concurrency.lockutils [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "5a6339ba-548e-4964-9c45-15df2cf116b5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.250 182938 DEBUG oslo_concurrency.lockutils [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.250 182938 DEBUG oslo_concurrency.lockutils [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.251 182938 DEBUG oslo_concurrency.lockutils [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.251 182938 DEBUG oslo_concurrency.lockutils [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.252 182938 INFO nova.compute.manager [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Terminating instance
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.759 182938 DEBUG nova.compute.manager [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:38:29 np0005603500 kernel: tap6e13d68c-b1 (unregistering): left promiscuous mode
Jan 31 01:38:29 np0005603500 NetworkManager[55506]: <info>  [1769841509.8596] device (tap6e13d68c-b1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:38:29 np0005603500 ovn_controller[95398]: 2026-01-31T06:38:29Z|00095|binding|INFO|Releasing lport 6e13d68c-b169-42b0-bafe-2026dd7b7c9f from this chassis (sb_readonly=0)
Jan 31 01:38:29 np0005603500 ovn_controller[95398]: 2026-01-31T06:38:29Z|00096|binding|INFO|Setting lport 6e13d68c-b169-42b0-bafe-2026dd7b7c9f down in Southbound
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.862 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:29 np0005603500 ovn_controller[95398]: 2026-01-31T06:38:29Z|00097|binding|INFO|Removing iface tap6e13d68c-b1 ovn-installed in OVS
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.864 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:29.870 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:a2:11 10.100.0.10'], port_security=['fa:16:3e:7a:a2:11 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5a6339ba-548e-4964-9c45-15df2cf116b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd0f1208b-2625-4bf0-aaaf-6a0e9a24c37e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29d6cd86-2fe1-46f4-ab6e-26f3373754c7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=6e13d68c-b169-42b0-bafe-2026dd7b7c9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:38:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:29.872 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 6e13d68c-b169-42b0-bafe-2026dd7b7c9f in datapath 5bb2649b-6b65-4e17-a6b3-abb539667aef unbound from our chassis
Jan 31 01:38:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:29.873 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bb2649b-6b65-4e17-a6b3-abb539667aef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:38:29 np0005603500 nova_compute[182934]: 2026-01-31 06:38:29.873 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:29.874 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[14bb3d2f-b2fa-4538-a44f-09ad3f77fd20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:29.874 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef namespace which is not needed anymore
Jan 31 01:38:29 np0005603500 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 31 01:38:29 np0005603500 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 14.650s CPU time.
Jan 31 01:38:29 np0005603500 systemd-machined[154375]: Machine qemu-4-instance-00000004 terminated.
Jan 31 01:38:29 np0005603500 neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef[214308]: [NOTICE]   (214326) : haproxy version is 2.8.14-c23fe91
Jan 31 01:38:29 np0005603500 neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef[214308]: [NOTICE]   (214326) : path to executable is /usr/sbin/haproxy
Jan 31 01:38:29 np0005603500 neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef[214308]: [WARNING]  (214326) : Exiting Master process...
Jan 31 01:38:29 np0005603500 podman[214890]: 2026-01-31 06:38:29.982991671 +0000 UTC m=+0.028258927 container kill 92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef, tcib_managed=true, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 01:38:29 np0005603500 neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef[214308]: [ALERT]    (214326) : Current worker (214334) exited with code 143 (Terminated)
Jan 31 01:38:29 np0005603500 neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef[214308]: [WARNING]  (214326) : All workers exited. Exiting... (0)
Jan 31 01:38:29 np0005603500 systemd[1]: libpod-92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e.scope: Deactivated successfully.
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.006 182938 INFO nova.virt.libvirt.driver [-] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Instance destroyed successfully.
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.007 182938 DEBUG nova.objects.instance [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 5a6339ba-548e-4964-9c45-15df2cf116b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:38:30 np0005603500 podman[214917]: 2026-01-31 06:38:30.057350572 +0000 UTC m=+0.058738493 container died 92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.258 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.287 182938 DEBUG nova.compute.manager [req-6ce3f49e-e80d-431c-9c60-18b084e34f49 req-49f4191c-62de-425c-aeaf-83dc59d8439d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received event network-vif-unplugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.287 182938 DEBUG oslo_concurrency.lockutils [req-6ce3f49e-e80d-431c-9c60-18b084e34f49 req-49f4191c-62de-425c-aeaf-83dc59d8439d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.287 182938 DEBUG oslo_concurrency.lockutils [req-6ce3f49e-e80d-431c-9c60-18b084e34f49 req-49f4191c-62de-425c-aeaf-83dc59d8439d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.288 182938 DEBUG oslo_concurrency.lockutils [req-6ce3f49e-e80d-431c-9c60-18b084e34f49 req-49f4191c-62de-425c-aeaf-83dc59d8439d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.288 182938 DEBUG nova.compute.manager [req-6ce3f49e-e80d-431c-9c60-18b084e34f49 req-49f4191c-62de-425c-aeaf-83dc59d8439d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] No waiting events found dispatching network-vif-unplugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.288 182938 DEBUG nova.compute.manager [req-6ce3f49e-e80d-431c-9c60-18b084e34f49 req-49f4191c-62de-425c-aeaf-83dc59d8439d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received event network-vif-unplugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:38:30 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e-userdata-shm.mount: Deactivated successfully.
Jan 31 01:38:30 np0005603500 systemd[1]: var-lib-containers-storage-overlay-2ef85706fed5713802142090825c7e461c05f589506a56b1d3c2329cd6873547-merged.mount: Deactivated successfully.
Jan 31 01:38:30 np0005603500 podman[214947]: 2026-01-31 06:38:30.495386056 +0000 UTC m=+0.062090178 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.517 182938 DEBUG nova.virt.libvirt.vif [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:36:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1600725576',display_name='tempest-TestNetworkBasicOps-server-1600725576',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1600725576',id=4,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBVf4DFKNCoqVbCeDiHmpnUJgB+AhhR5LyMEr+wH5zX90PeFXhDIlV+tsioKRgxnkIW5+o7qlCZ0JP/wGThi+7YhSDHjcF3W9ZrpMft2psVh5chsymmjI3kuU5RAc1Dd6Q==',key_name='tempest-TestNetworkBasicOps-372120458',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:36:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-23axq0bw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:36:59Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=5a6339ba-548e-4964-9c45-15df2cf116b5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.518 182938 DEBUG nova.network.os_vif_util [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "address": "fa:16:3e:7a:a2:11", "network": {"id": "5bb2649b-6b65-4e17-a6b3-abb539667aef", "bridge": "br-int", "label": "tempest-network-smoke--1051141805", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e13d68c-b1", "ovs_interfaceid": "6e13d68c-b169-42b0-bafe-2026dd7b7c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.518 182938 DEBUG nova.network.os_vif_util [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7a:a2:11,bridge_name='br-int',has_traffic_filtering=True,id=6e13d68c-b169-42b0-bafe-2026dd7b7c9f,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e13d68c-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.518 182938 DEBUG os_vif [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:a2:11,bridge_name='br-int',has_traffic_filtering=True,id=6e13d68c-b169-42b0-bafe-2026dd7b7c9f,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e13d68c-b1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.520 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.520 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e13d68c-b1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.522 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.525 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.525 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.526 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=51703e61-10f0-4814-8b7c-5c7e0c46a554) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.526 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.529 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.530 182938 INFO os_vif [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:a2:11,bridge_name='br-int',has_traffic_filtering=True,id=6e13d68c-b169-42b0-bafe-2026dd7b7c9f,network=Network(5bb2649b-6b65-4e17-a6b3-abb539667aef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e13d68c-b1')
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.531 182938 INFO nova.virt.libvirt.driver [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Deleting instance files /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5_del
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.532 182938 INFO nova.virt.libvirt.driver [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Deletion of /var/lib/nova/instances/5a6339ba-548e-4964-9c45-15df2cf116b5_del complete
Jan 31 01:38:30 np0005603500 podman[214917]: 2026-01-31 06:38:30.553666103 +0000 UTC m=+0.555054004 container cleanup 92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 01:38:30 np0005603500 systemd[1]: libpod-conmon-92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e.scope: Deactivated successfully.
Jan 31 01:38:30 np0005603500 podman[214933]: 2026-01-31 06:38:30.577725437 +0000 UTC m=+0.517029071 container remove 92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.581 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a2bc84-d616-4d5e-84d1-267b109269e8]: (4, ("Sat Jan 31 06:38:29 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef (92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e)\n92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e\nSat Jan 31 06:38:30 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef (92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e)\n92f1d321f07ee99d301cf34382571b4c6dc861546b24d31d97e895a579c95f6e\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.584 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[b354a5fb-bc3f-4969-be97-cfcf5128ff15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.585 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bb2649b-6b65-4e17-a6b3-abb539667aef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.586 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4357cc-e121-455c-8a3a-06a3edeac5db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.587 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bb2649b-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.588 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:30 np0005603500 kernel: tap5bb2649b-60: left promiscuous mode
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.594 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:30 np0005603500 nova_compute[182934]: 2026-01-31 06:38:30.595 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.597 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6be9b082-3ccc-4e2d-b4fc-49d8283f927a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.611 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8d46c646-1286-4ae9-8893-0c7e4a126dbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.612 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6c62e074-b230-486b-a7ea-9e1d8ea1af8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.625 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[946ec36d-4df3-46f0-a948-2fb70842f632]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372919, 'reachable_time': 21411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214975, 'error': None, 'target': 'ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.627 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5bb2649b-6b65-4e17-a6b3-abb539667aef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:38:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:30.628 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[dd24917f-6d6a-447d-9d6f-b87e6e2b4c9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:30 np0005603500 systemd[1]: run-netns-ovnmeta\x2d5bb2649b\x2d6b65\x2d4e17\x2da6b3\x2dabb539667aef.mount: Deactivated successfully.
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.044 182938 INFO nova.compute.manager [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Took 1.28 seconds to destroy the instance on the hypervisor.
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.044 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.044 182938 DEBUG nova.compute.manager [-] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.045 182938 DEBUG nova.network.neutron [-] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.666 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.667 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.784 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.785 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5763MB free_disk=73.21575927734375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.785 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:31 np0005603500 nova_compute[182934]: 2026-01-31 06:38:31.785 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.514 182938 DEBUG nova.compute.manager [req-ed8481c2-c183-4434-a5b7-fc74fc24ec06 req-83abb490-376a-4f63-8f67-aa2be69c7691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received event network-vif-plugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.514 182938 DEBUG oslo_concurrency.lockutils [req-ed8481c2-c183-4434-a5b7-fc74fc24ec06 req-83abb490-376a-4f63-8f67-aa2be69c7691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.515 182938 DEBUG oslo_concurrency.lockutils [req-ed8481c2-c183-4434-a5b7-fc74fc24ec06 req-83abb490-376a-4f63-8f67-aa2be69c7691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.515 182938 DEBUG oslo_concurrency.lockutils [req-ed8481c2-c183-4434-a5b7-fc74fc24ec06 req-83abb490-376a-4f63-8f67-aa2be69c7691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.515 182938 DEBUG nova.compute.manager [req-ed8481c2-c183-4434-a5b7-fc74fc24ec06 req-83abb490-376a-4f63-8f67-aa2be69c7691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] No waiting events found dispatching network-vif-plugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.515 182938 WARNING nova.compute.manager [req-ed8481c2-c183-4434-a5b7-fc74fc24ec06 req-83abb490-376a-4f63-8f67-aa2be69c7691 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received unexpected event network-vif-plugged-6e13d68c-b169-42b0-bafe-2026dd7b7c9f for instance with vm_state active and task_state deleting.
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.830 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 5a6339ba-548e-4964-9c45-15df2cf116b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.831 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.831 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:38:32 np0005603500 nova_compute[182934]: 2026-01-31 06:38:32.879 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:38:33 np0005603500 nova_compute[182934]: 2026-01-31 06:38:33.420 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:33 np0005603500 nova_compute[182934]: 2026-01-31 06:38:33.442 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:38:33 np0005603500 nova_compute[182934]: 2026-01-31 06:38:33.817 182938 DEBUG nova.compute.manager [req-a5bc3d5c-ecc6-412a-a4b1-9716c8b4026a req-564c2c4f-e457-4348-b074-4c20a387cf3d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Received event network-vif-deleted-6e13d68c-b169-42b0-bafe-2026dd7b7c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:38:33 np0005603500 nova_compute[182934]: 2026-01-31 06:38:33.817 182938 INFO nova.compute.manager [req-a5bc3d5c-ecc6-412a-a4b1-9716c8b4026a req-564c2c4f-e457-4348-b074-4c20a387cf3d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Neutron deleted interface 6e13d68c-b169-42b0-bafe-2026dd7b7c9f; detaching it from the instance and deleting it from the info cache
Jan 31 01:38:33 np0005603500 nova_compute[182934]: 2026-01-31 06:38:33.818 182938 DEBUG nova.network.neutron [req-a5bc3d5c-ecc6-412a-a4b1-9716c8b4026a req-564c2c4f-e457-4348-b074-4c20a387cf3d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:38:33 np0005603500 nova_compute[182934]: 2026-01-31 06:38:33.964 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:38:33 np0005603500 nova_compute[182934]: 2026-01-31 06:38:33.964 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:34 np0005603500 nova_compute[182934]: 2026-01-31 06:38:34.085 182938 DEBUG nova.network.neutron [-] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:38:34 np0005603500 nova_compute[182934]: 2026-01-31 06:38:34.327 182938 DEBUG nova.compute.manager [req-a5bc3d5c-ecc6-412a-a4b1-9716c8b4026a req-564c2c4f-e457-4348-b074-4c20a387cf3d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Detach interface failed, port_id=6e13d68c-b169-42b0-bafe-2026dd7b7c9f, reason: Instance 5a6339ba-548e-4964-9c45-15df2cf116b5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Jan 31 01:38:34 np0005603500 nova_compute[182934]: 2026-01-31 06:38:34.596 182938 INFO nova.compute.manager [-] [instance: 5a6339ba-548e-4964-9c45-15df2cf116b5] Took 3.55 seconds to deallocate network for instance.
Jan 31 01:38:34 np0005603500 nova_compute[182934]: 2026-01-31 06:38:34.964 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:38:34 np0005603500 nova_compute[182934]: 2026-01-31 06:38:34.965 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:38:34 np0005603500 nova_compute[182934]: 2026-01-31 06:38:34.965 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:38:34 np0005603500 nova_compute[182934]: 2026-01-31 06:38:34.965 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:38:34 np0005603500 nova_compute[182934]: 2026-01-31 06:38:34.966 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:38:35 np0005603500 nova_compute[182934]: 2026-01-31 06:38:35.107 182938 DEBUG oslo_concurrency.lockutils [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:35 np0005603500 nova_compute[182934]: 2026-01-31 06:38:35.108 182938 DEBUG oslo_concurrency.lockutils [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:35 np0005603500 nova_compute[182934]: 2026-01-31 06:38:35.150 182938 DEBUG nova.compute.provider_tree [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:38:35 np0005603500 nova_compute[182934]: 2026-01-31 06:38:35.527 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:35 np0005603500 nova_compute[182934]: 2026-01-31 06:38:35.657 182938 DEBUG nova.scheduler.client.report [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:38:36 np0005603500 nova_compute[182934]: 2026-01-31 06:38:36.144 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:38:36 np0005603500 nova_compute[182934]: 2026-01-31 06:38:36.165 182938 DEBUG oslo_concurrency.lockutils [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:36 np0005603500 nova_compute[182934]: 2026-01-31 06:38:36.190 182938 INFO nova.scheduler.client.report [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 5a6339ba-548e-4964-9c45-15df2cf116b5
Jan 31 01:38:37 np0005603500 podman[214978]: 2026-01-31 06:38:37.137165372 +0000 UTC m=+0.054924240 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 31 01:38:37 np0005603500 nova_compute[182934]: 2026-01-31 06:38:37.221 182938 DEBUG oslo_concurrency.lockutils [None req-a09ac1eb-c18f-48e6-8401-a85e7b910a4b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5a6339ba-548e-4964-9c45-15df2cf116b5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.971s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:38 np0005603500 nova_compute[182934]: 2026-01-31 06:38:38.421 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:39 np0005603500 podman[214997]: 2026-01-31 06:38:39.126916673 +0000 UTC m=+0.047028529 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:38:40 np0005603500 nova_compute[182934]: 2026-01-31 06:38:40.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:38:40 np0005603500 nova_compute[182934]: 2026-01-31 06:38:40.529 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:41 np0005603500 nova_compute[182934]: 2026-01-31 06:38:41.537 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:41 np0005603500 nova_compute[182934]: 2026-01-31 06:38:41.555 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:43 np0005603500 nova_compute[182934]: 2026-01-31 06:38:43.423 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:45 np0005603500 nova_compute[182934]: 2026-01-31 06:38:45.530 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:48 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:48.400 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:73:85 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef63d3b7-bb4b-4593-95e1-75f2cdd31d5e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2f0d159b-8dc8-4f02-92d2-05f0b9cc7315) old=Port_Binding(mac=['fa:16:3e:f8:73:85'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:38:48 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:48.402 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2f0d159b-8dc8-4f02-92d2-05f0b9cc7315 in datapath 719b4b2c-8a13-4225-a6bd-071da9c7ca99 updated
Jan 31 01:38:48 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:48.403 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 719b4b2c-8a13-4225-a6bd-071da9c7ca99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:38:48 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:48.404 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[10daa940-ad39-4d8b-8189-3dd94487661d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:38:48 np0005603500 nova_compute[182934]: 2026-01-31 06:38:48.426 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:50 np0005603500 nova_compute[182934]: 2026-01-31 06:38:50.533 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:52 np0005603500 podman[215023]: 2026-01-31 06:38:52.126056486 +0000 UTC m=+0.045748096 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 31 01:38:52 np0005603500 podman[215024]: 2026-01-31 06:38:52.13214654 +0000 UTC m=+0.049323940 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 01:38:53 np0005603500 nova_compute[182934]: 2026-01-31 06:38:53.430 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:55 np0005603500 nova_compute[182934]: 2026-01-31 06:38:55.614 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:38:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:55.901 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:38:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:55.902 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:38:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:38:55.902 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:38:58 np0005603500 nova_compute[182934]: 2026-01-31 06:38:58.430 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:00 np0005603500 podman[215067]: 2026-01-31 06:39:00.137405521 +0000 UTC m=+0.057245764 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1769056855, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 31 01:39:00 np0005603500 nova_compute[182934]: 2026-01-31 06:39:00.616 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:01 np0005603500 podman[215089]: 2026-01-31 06:39:01.151325516 +0000 UTC m=+0.071531938 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:39:03 np0005603500 nova_compute[182934]: 2026-01-31 06:39:03.434 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:05 np0005603500 nova_compute[182934]: 2026-01-31 06:39:05.618 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:07 np0005603500 nova_compute[182934]: 2026-01-31 06:39:07.903 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:07 np0005603500 nova_compute[182934]: 2026-01-31 06:39:07.903 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:08 np0005603500 podman[215115]: 2026-01-31 06:39:08.128054326 +0000 UTC m=+0.047014427 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 01:39:08 np0005603500 nova_compute[182934]: 2026-01-31 06:39:08.408 182938 DEBUG nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:39:08 np0005603500 nova_compute[182934]: 2026-01-31 06:39:08.435 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:08 np0005603500 nova_compute[182934]: 2026-01-31 06:39:08.945 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:08 np0005603500 nova_compute[182934]: 2026-01-31 06:39:08.945 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:08 np0005603500 nova_compute[182934]: 2026-01-31 06:39:08.953 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:39:08 np0005603500 nova_compute[182934]: 2026-01-31 06:39:08.953 182938 INFO nova.compute.claims [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:39:10 np0005603500 nova_compute[182934]: 2026-01-31 06:39:10.006 182938 DEBUG nova.compute.provider_tree [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:39:10 np0005603500 podman[215136]: 2026-01-31 06:39:10.119131569 +0000 UTC m=+0.034431737 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:39:10 np0005603500 nova_compute[182934]: 2026-01-31 06:39:10.513 182938 DEBUG nova.scheduler.client.report [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:39:10 np0005603500 nova_compute[182934]: 2026-01-31 06:39:10.620 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:11 np0005603500 nova_compute[182934]: 2026-01-31 06:39:11.023 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:11 np0005603500 nova_compute[182934]: 2026-01-31 06:39:11.024 182938 DEBUG nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:39:11 np0005603500 nova_compute[182934]: 2026-01-31 06:39:11.537 182938 DEBUG nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:39:11 np0005603500 nova_compute[182934]: 2026-01-31 06:39:11.538 182938 DEBUG nova.network.neutron [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:39:11 np0005603500 nova_compute[182934]: 2026-01-31 06:39:11.962 182938 DEBUG nova.policy [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:39:12 np0005603500 nova_compute[182934]: 2026-01-31 06:39:12.045 182938 INFO nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:39:12 np0005603500 nova_compute[182934]: 2026-01-31 06:39:12.553 182938 DEBUG nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.475 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.572 182938 DEBUG nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.573 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.574 182938 INFO nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Creating image(s)
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.574 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.575 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.575 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.577 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.582 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.584 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.629 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.630 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.630 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.631 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.635 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.635 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.678 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.679 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:39:13 np0005603500 nova_compute[182934]: 2026-01-31 06:39:13.983 182938 DEBUG nova.network.neutron [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Successfully created port: df57296c-8ac3-44b0-8fb6-9f85d0a93bdc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.145 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk 1073741824" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.146 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.516s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.147 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.196 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.197 182938 DEBUG nova.virt.disk.api [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.197 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.247 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.248 182938 DEBUG nova.virt.disk.api [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.249 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.249 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Ensure instance console log exists: /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.249 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.250 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.250 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:14 np0005603500 nova_compute[182934]: 2026-01-31 06:39:14.965 182938 DEBUG nova.network.neutron [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Successfully updated port: df57296c-8ac3-44b0-8fb6-9f85d0a93bdc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:39:15 np0005603500 nova_compute[182934]: 2026-01-31 06:39:15.224 182938 DEBUG nova.compute.manager [req-df4f0a06-2db6-4bee-857c-7d80a03513ee req-bdcdfee8-c418-4099-abd0-5deee0d3af4f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-changed-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:39:15 np0005603500 nova_compute[182934]: 2026-01-31 06:39:15.224 182938 DEBUG nova.compute.manager [req-df4f0a06-2db6-4bee-857c-7d80a03513ee req-bdcdfee8-c418-4099-abd0-5deee0d3af4f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing instance network info cache due to event network-changed-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:39:15 np0005603500 nova_compute[182934]: 2026-01-31 06:39:15.225 182938 DEBUG oslo_concurrency.lockutils [req-df4f0a06-2db6-4bee-857c-7d80a03513ee req-bdcdfee8-c418-4099-abd0-5deee0d3af4f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:39:15 np0005603500 nova_compute[182934]: 2026-01-31 06:39:15.225 182938 DEBUG oslo_concurrency.lockutils [req-df4f0a06-2db6-4bee-857c-7d80a03513ee req-bdcdfee8-c418-4099-abd0-5deee0d3af4f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:39:15 np0005603500 nova_compute[182934]: 2026-01-31 06:39:15.225 182938 DEBUG nova.network.neutron [req-df4f0a06-2db6-4bee-857c-7d80a03513ee req-bdcdfee8-c418-4099-abd0-5deee0d3af4f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing network info cache for port df57296c-8ac3-44b0-8fb6-9f85d0a93bdc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:39:15 np0005603500 nova_compute[182934]: 2026-01-31 06:39:15.472 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:39:15 np0005603500 nova_compute[182934]: 2026-01-31 06:39:15.622 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:16 np0005603500 nova_compute[182934]: 2026-01-31 06:39:16.749 182938 DEBUG nova.network.neutron [req-df4f0a06-2db6-4bee-857c-7d80a03513ee req-bdcdfee8-c418-4099-abd0-5deee0d3af4f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:39:18 np0005603500 nova_compute[182934]: 2026-01-31 06:39:18.033 182938 DEBUG nova.network.neutron [req-df4f0a06-2db6-4bee-857c-7d80a03513ee req-bdcdfee8-c418-4099-abd0-5deee0d3af4f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:39:18 np0005603500 nova_compute[182934]: 2026-01-31 06:39:18.477 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:18 np0005603500 nova_compute[182934]: 2026-01-31 06:39:18.539 182938 DEBUG oslo_concurrency.lockutils [req-df4f0a06-2db6-4bee-857c-7d80a03513ee req-bdcdfee8-c418-4099-abd0-5deee0d3af4f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:39:18 np0005603500 nova_compute[182934]: 2026-01-31 06:39:18.540 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:39:18 np0005603500 nova_compute[182934]: 2026-01-31 06:39:18.540 182938 DEBUG nova.network.neutron [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:39:19 np0005603500 nova_compute[182934]: 2026-01-31 06:39:19.416 182938 DEBUG nova.network.neutron [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:39:20 np0005603500 nova_compute[182934]: 2026-01-31 06:39:20.624 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:20 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:20Z|00098|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 31 01:39:22 np0005603500 nova_compute[182934]: 2026-01-31 06:39:22.739 182938 DEBUG nova.network.neutron [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:39:23 np0005603500 podman[215176]: 2026-01-31 06:39:23.135489071 +0000 UTC m=+0.049004462 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Jan 31 01:39:23 np0005603500 podman[215175]: 2026-01-31 06:39:23.15400934 +0000 UTC m=+0.069935627 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.244 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.245 182938 DEBUG nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Instance network_info: |[{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.247 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Start _get_guest_xml network_info=[{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.250 182938 WARNING nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.251 182938 DEBUG nova.virt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-3559488', uuid='8515b99e-89c2-4a03-9a14-d6a0c3dca692'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841563.251051) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.259 182938 DEBUG nova.virt.libvirt.host [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.259 182938 DEBUG nova.virt.libvirt.host [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.262 182938 DEBUG nova.virt.libvirt.host [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.262 182938 DEBUG nova.virt.libvirt.host [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.263 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.263 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.263 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.263 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.264 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.264 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.264 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.264 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.264 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.264 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.265 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.265 182938 DEBUG nova.virt.hardware [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.268 182938 DEBUG nova.virt.libvirt.vif [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:39:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-3559488',display_name='tempest-TestNetworkBasicOps-server-3559488',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-3559488',id=6,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChYKDUjvhjCGs5+r/emnCrvtzxjRm0xGy2EtjvWZf5uSCdLdeFqDloA/lKvnHUNOd7mjbOXH0cnbc13Uy4w1Pw4Qnj3hkpXumiALlpO/vfZJol+WTWZCmQRTTzYN7UeJQ==',key_name='tempest-TestNetworkBasicOps-1595486681',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-gu629u0n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:39:12Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=8515b99e-89c2-4a03-9a14-d6a0c3dca692,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.268 182938 DEBUG nova.network.os_vif_util [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.269 182938 DEBUG nova.network.os_vif_util [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:6a:c9,bridge_name='br-int',has_traffic_filtering=True,id=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc,network=Network(719b4b2c-8a13-4225-a6bd-071da9c7ca99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf57296c-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.269 182938 DEBUG nova.objects.instance [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8515b99e-89c2-4a03-9a14-d6a0c3dca692 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.478 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.777 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <uuid>8515b99e-89c2-4a03-9a14-d6a0c3dca692</uuid>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <name>instance-00000006</name>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-3559488</nova:name>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:39:23</nova:creationTime>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:        <nova:port uuid="df57296c-8ac3-44b0-8fb6-9f85d0a93bdc">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <entry name="serial">8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <entry name="uuid">8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.config"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:bf:6a:c9"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <target dev="tapdf57296c-8a"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log" append="off"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:39:23 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:39:23 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:39:23 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:39:23 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.778 182938 DEBUG nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Preparing to wait for external event network-vif-plugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.778 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.778 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.779 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.779 182938 DEBUG nova.virt.libvirt.vif [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:39:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-3559488',display_name='tempest-TestNetworkBasicOps-server-3559488',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-3559488',id=6,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChYKDUjvhjCGs5+r/emnCrvtzxjRm0xGy2EtjvWZf5uSCdLdeFqDloA/lKvnHUNOd7mjbOXH0cnbc13Uy4w1Pw4Qnj3hkpXumiALlpO/vfZJol+WTWZCmQRTTzYN7UeJQ==',key_name='tempest-TestNetworkBasicOps-1595486681',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-gu629u0n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:39:12Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=8515b99e-89c2-4a03-9a14-d6a0c3dca692,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.780 182938 DEBUG nova.network.os_vif_util [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.780 182938 DEBUG nova.network.os_vif_util [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:6a:c9,bridge_name='br-int',has_traffic_filtering=True,id=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc,network=Network(719b4b2c-8a13-4225-a6bd-071da9c7ca99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf57296c-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.781 182938 DEBUG os_vif [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:6a:c9,bridge_name='br-int',has_traffic_filtering=True,id=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc,network=Network(719b4b2c-8a13-4225-a6bd-071da9c7ca99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf57296c-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.781 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.782 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.782 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.783 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.783 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5d640a01-026b-5036-8086-0c6edb8d9793', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.784 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.785 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.788 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.788 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf57296c-8a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.788 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapdf57296c-8a, col_values=(('qos', UUID('61772f54-5353-45af-a365-cfa53eba2e42')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.789 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapdf57296c-8a, col_values=(('external_ids', {'iface-id': 'df57296c-8ac3-44b0-8fb6-9f85d0a93bdc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:6a:c9', 'vm-uuid': '8515b99e-89c2-4a03-9a14-d6a0c3dca692'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.790 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:23 np0005603500 NetworkManager[55506]: <info>  [1769841563.7910] manager: (tapdf57296c-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.792 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.794 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:23 np0005603500 nova_compute[182934]: 2026-01-31 06:39:23.795 182938 INFO os_vif [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:6a:c9,bridge_name='br-int',has_traffic_filtering=True,id=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc,network=Network(719b4b2c-8a13-4225-a6bd-071da9c7ca99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf57296c-8a')
Jan 31 01:39:25 np0005603500 nova_compute[182934]: 2026-01-31 06:39:25.326 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:39:25 np0005603500 nova_compute[182934]: 2026-01-31 06:39:25.327 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:39:25 np0005603500 nova_compute[182934]: 2026-01-31 06:39:25.327 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:bf:6a:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:39:25 np0005603500 nova_compute[182934]: 2026-01-31 06:39:25.328 182938 INFO nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Using config drive
Jan 31 01:39:27 np0005603500 nova_compute[182934]: 2026-01-31 06:39:27.096 182938 INFO nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Creating config drive at /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.config
Jan 31 01:39:27 np0005603500 nova_compute[182934]: 2026-01-31 06:39:27.100 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpk1gdgs59 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:39:27 np0005603500 nova_compute[182934]: 2026-01-31 06:39:27.221 182938 DEBUG oslo_concurrency.processutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpk1gdgs59" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:39:27 np0005603500 NetworkManager[55506]: <info>  [1769841567.2683] manager: (tapdf57296c-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 31 01:39:27 np0005603500 kernel: tapdf57296c-8a: entered promiscuous mode
Jan 31 01:39:27 np0005603500 nova_compute[182934]: 2026-01-31 06:39:27.269 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:27 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:27Z|00099|binding|INFO|Claiming lport df57296c-8ac3-44b0-8fb6-9f85d0a93bdc for this chassis.
Jan 31 01:39:27 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:27Z|00100|binding|INFO|df57296c-8ac3-44b0-8fb6-9f85d0a93bdc: Claiming fa:16:3e:bf:6a:c9 10.100.0.14
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.285 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:6a:c9 10.100.0.14'], port_security=['fa:16:3e:bf:6a:c9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8515b99e-89c2-4a03-9a14-d6a0c3dca692', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6059e68c-416d-405d-841f-1d281e86dc7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef63d3b7-bb4b-4593-95e1-75f2cdd31d5e, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.287 104644 INFO neutron.agent.ovn.metadata.agent [-] Port df57296c-8ac3-44b0-8fb6-9f85d0a93bdc in datapath 719b4b2c-8a13-4225-a6bd-071da9c7ca99 bound to our chassis
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.288 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 719b4b2c-8a13-4225-a6bd-071da9c7ca99
Jan 31 01:39:27 np0005603500 nova_compute[182934]: 2026-01-31 06:39:27.292 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:27 np0005603500 systemd-udevd[215237]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.298 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ab082e-659e-4887-8ca7-25f49cac9a1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.298 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap719b4b2c-81 in ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:39:27 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:27Z|00101|binding|INFO|Setting lport df57296c-8ac3-44b0-8fb6-9f85d0a93bdc ovn-installed in OVS
Jan 31 01:39:27 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:27Z|00102|binding|INFO|Setting lport df57296c-8ac3-44b0-8fb6-9f85d0a93bdc up in Southbound
Jan 31 01:39:27 np0005603500 nova_compute[182934]: 2026-01-31 06:39:27.300 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.300 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap719b4b2c-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.301 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[9e97ebd9-a1b4-4a50-88be-13a6799fc23b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.302 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[60aa13e3-47b2-48c4-9153-bff4d5f5373d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 systemd-machined[154375]: New machine qemu-6-instance-00000006.
Jan 31 01:39:27 np0005603500 NetworkManager[55506]: <info>  [1769841567.3122] device (tapdf57296c-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:39:27 np0005603500 NetworkManager[55506]: <info>  [1769841567.3129] device (tapdf57296c-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.311 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[083e0abe-345b-4fed-b032-21755de1be25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.324 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[2d376a64-312d-491e-807e-08d1541496de]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.346 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[0222b60f-4eb3-4b55-8a1e-3a10e475aedc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 NetworkManager[55506]: <info>  [1769841567.3510] manager: (tap719b4b2c-80): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.350 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8f3ce9-75d6-453f-9b4b-1b61cee8f858]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.371 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[1312b40e-0f3a-445a-aede-8e701ab63deb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.373 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb6cf60-e79f-40a8-9107-373c52124365]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 NetworkManager[55506]: <info>  [1769841567.3903] device (tap719b4b2c-80): carrier: link connected
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.395 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a2801e-2c5c-45ef-8dfb-3ca80c23c96a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.408 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[0de44679-1e78-407d-b041-4b732fafb16a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap719b4b2c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:73:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387868, 'reachable_time': 24537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215271, 'error': None, 'target': 'ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.417 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4b792f-6751-47b1-bbdc-2526a2a94a5a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef8:7385'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387868, 'tstamp': 387868}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215272, 'error': None, 'target': 'ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.426 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c77c3d0a-9025-457d-a934-79f46d0962ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap719b4b2c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:73:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387868, 'reachable_time': 24537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215273, 'error': None, 'target': 'ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.441 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[9e2788c3-3157-4170-b752-1623a839ba7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.474 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[1d0dd3db-cf4b-479c-b1f6-67ac63eb56d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.475 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap719b4b2c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.476 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.476 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap719b4b2c-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:39:27 np0005603500 kernel: tap719b4b2c-80: entered promiscuous mode
Jan 31 01:39:27 np0005603500 NetworkManager[55506]: <info>  [1769841567.4791] manager: (tap719b4b2c-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 31 01:39:27 np0005603500 nova_compute[182934]: 2026-01-31 06:39:27.479 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.481 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap719b4b2c-80, col_values=(('external_ids', {'iface-id': '2f0d159b-8dc8-4f02-92d2-05f0b9cc7315'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:39:27 np0005603500 nova_compute[182934]: 2026-01-31 06:39:27.482 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:27 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:27Z|00103|binding|INFO|Releasing lport 2f0d159b-8dc8-4f02-92d2-05f0b9cc7315 from this chassis (sb_readonly=0)
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.484 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[357aba8e-5cb2-48f4-9e48-c1eea9f4ea14]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.485 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.485 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.485 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 719b4b2c-8a13-4225-a6bd-071da9c7ca99 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.485 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:39:27 np0005603500 nova_compute[182934]: 2026-01-31 06:39:27.486 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.487 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[9af393c7-2fcb-4014-b3a1-9e54748dd2b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.487 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.488 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[bf892720-0d04-4025-a84d-3a169c28e486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.488 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-719b4b2c-8a13-4225-a6bd-071da9c7ca99
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID 719b4b2c-8a13-4225-a6bd-071da9c7ca99
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:39:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:27.489 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'env', 'PROCESS_TAG=haproxy-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/719b4b2c-8a13-4225-a6bd-071da9c7ca99.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:39:27 np0005603500 podman[215310]: 2026-01-31 06:39:27.821338422 +0000 UTC m=+0.022679843 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:39:28 np0005603500 podman[215310]: 2026-01-31 06:39:28.021443804 +0000 UTC m=+0.222785205 container create 0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.024 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:28 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:28.025 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.051 182938 DEBUG nova.compute.manager [req-946086d0-b549-49bb-a96d-f4593ebfb41c req-1305c6d4-db82-47b6-924f-3bf641fa68f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-plugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.051 182938 DEBUG oslo_concurrency.lockutils [req-946086d0-b549-49bb-a96d-f4593ebfb41c req-1305c6d4-db82-47b6-924f-3bf641fa68f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.051 182938 DEBUG oslo_concurrency.lockutils [req-946086d0-b549-49bb-a96d-f4593ebfb41c req-1305c6d4-db82-47b6-924f-3bf641fa68f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.052 182938 DEBUG oslo_concurrency.lockutils [req-946086d0-b549-49bb-a96d-f4593ebfb41c req-1305c6d4-db82-47b6-924f-3bf641fa68f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.052 182938 DEBUG nova.compute.manager [req-946086d0-b549-49bb-a96d-f4593ebfb41c req-1305c6d4-db82-47b6-924f-3bf641fa68f1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Processing event network-vif-plugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.053 182938 DEBUG nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.056 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.061 182938 INFO nova.virt.libvirt.driver [-] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Instance spawned successfully.
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.062 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:39:28 np0005603500 systemd[1]: Started libpod-conmon-0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2.scope.
Jan 31 01:39:28 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:39:28 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dfc1cc840a7aff4c3b5038ce5865bdb6f5903485bbe17a8486b932aea2129e7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:39:28 np0005603500 podman[215310]: 2026-01-31 06:39:28.250467827 +0000 UTC m=+0.451809248 container init 0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 01:39:28 np0005603500 podman[215310]: 2026-01-31 06:39:28.255756005 +0000 UTC m=+0.457097406 container start 0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2)
Jan 31 01:39:28 np0005603500 neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99[215326]: [NOTICE]   (215330) : New worker (215332) forked
Jan 31 01:39:28 np0005603500 neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99[215326]: [NOTICE]   (215330) : Loading success.
Jan 31 01:39:28 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:28.433 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.480 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.603 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.603 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.604 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.604 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.604 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.605 182938 DEBUG nova.virt.libvirt.driver [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:39:28 np0005603500 nova_compute[182934]: 2026-01-31 06:39:28.790 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:29 np0005603500 nova_compute[182934]: 2026-01-31 06:39:29.156 182938 INFO nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Took 15.58 seconds to spawn the instance on the hypervisor.
Jan 31 01:39:29 np0005603500 nova_compute[182934]: 2026-01-31 06:39:29.156 182938 DEBUG nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:39:29 np0005603500 nova_compute[182934]: 2026-01-31 06:39:29.705 182938 INFO nova.compute.manager [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Took 20.79 seconds to build instance.
Jan 31 01:39:30 np0005603500 nova_compute[182934]: 2026-01-31 06:39:30.240 182938 DEBUG oslo_concurrency.lockutils [None req-120e37ef-bde4-4223-afb3-c35e51781a80 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:30 np0005603500 nova_compute[182934]: 2026-01-31 06:39:30.320 182938 DEBUG nova.compute.manager [req-3cfc3c78-0a24-4b37-9955-11ae1fc4b508 req-55780dbc-8d1f-46ed-9b9f-4e960ad146c3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-plugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:39:30 np0005603500 nova_compute[182934]: 2026-01-31 06:39:30.320 182938 DEBUG oslo_concurrency.lockutils [req-3cfc3c78-0a24-4b37-9955-11ae1fc4b508 req-55780dbc-8d1f-46ed-9b9f-4e960ad146c3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:30 np0005603500 nova_compute[182934]: 2026-01-31 06:39:30.321 182938 DEBUG oslo_concurrency.lockutils [req-3cfc3c78-0a24-4b37-9955-11ae1fc4b508 req-55780dbc-8d1f-46ed-9b9f-4e960ad146c3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:30 np0005603500 nova_compute[182934]: 2026-01-31 06:39:30.321 182938 DEBUG oslo_concurrency.lockutils [req-3cfc3c78-0a24-4b37-9955-11ae1fc4b508 req-55780dbc-8d1f-46ed-9b9f-4e960ad146c3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:30 np0005603500 nova_compute[182934]: 2026-01-31 06:39:30.321 182938 DEBUG nova.compute.manager [req-3cfc3c78-0a24-4b37-9955-11ae1fc4b508 req-55780dbc-8d1f-46ed-9b9f-4e960ad146c3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] No waiting events found dispatching network-vif-plugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:39:30 np0005603500 nova_compute[182934]: 2026-01-31 06:39:30.322 182938 WARNING nova.compute.manager [req-3cfc3c78-0a24-4b37-9955-11ae1fc4b508 req-55780dbc-8d1f-46ed-9b9f-4e960ad146c3 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received unexpected event network-vif-plugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc for instance with vm_state active and task_state None.
Jan 31 01:39:31 np0005603500 podman[215342]: 2026-01-31 06:39:31.130389262 +0000 UTC m=+0.048415093 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 01:39:31 np0005603500 nova_compute[182934]: 2026-01-31 06:39:31.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:39:32 np0005603500 nova_compute[182934]: 2026-01-31 06:39:32.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:39:32 np0005603500 podman[215363]: 2026-01-31 06:39:32.185976355 +0000 UTC m=+0.097112003 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller)
Jan 31 01:39:33 np0005603500 nova_compute[182934]: 2026-01-31 06:39:33.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:39:33 np0005603500 nova_compute[182934]: 2026-01-31 06:39:33.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:39:33 np0005603500 nova_compute[182934]: 2026-01-31 06:39:33.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:39:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:33.435 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:39:33 np0005603500 nova_compute[182934]: 2026-01-31 06:39:33.482 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:33 np0005603500 nova_compute[182934]: 2026-01-31 06:39:33.660 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:33 np0005603500 nova_compute[182934]: 2026-01-31 06:39:33.661 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:33 np0005603500 nova_compute[182934]: 2026-01-31 06:39:33.661 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:33 np0005603500 nova_compute[182934]: 2026-01-31 06:39:33.661 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:39:33 np0005603500 nova_compute[182934]: 2026-01-31 06:39:33.793 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:34 np0005603500 nova_compute[182934]: 2026-01-31 06:39:34.699 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:39:34 np0005603500 nova_compute[182934]: 2026-01-31 06:39:34.752 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:39:34 np0005603500 nova_compute[182934]: 2026-01-31 06:39:34.753 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:39:34 np0005603500 nova_compute[182934]: 2026-01-31 06:39:34.806 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:39:34 np0005603500 nova_compute[182934]: 2026-01-31 06:39:34.940 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:39:34 np0005603500 nova_compute[182934]: 2026-01-31 06:39:34.941 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5618MB free_disk=73.21136474609375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:39:34 np0005603500 nova_compute[182934]: 2026-01-31 06:39:34.941 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:34 np0005603500 nova_compute[182934]: 2026-01-31 06:39:34.942 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:36 np0005603500 NetworkManager[55506]: <info>  [1769841575.9961] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 31 01:39:36 np0005603500 NetworkManager[55506]: <info>  [1769841575.9970] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:35.997 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:36 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:35Z|00104|binding|INFO|Releasing lport 2f0d159b-8dc8-4f02-92d2-05f0b9cc7315 from this chassis (sb_readonly=0)
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.002 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.002 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.003 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:39:36 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:36Z|00105|binding|INFO|Releasing lport 2f0d159b-8dc8-4f02-92d2-05f0b9cc7315 from this chassis (sb_readonly=0)
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.006 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.011 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.045 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.554 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.957 182938 DEBUG nova.compute.manager [req-80d87909-6448-4e4a-9605-44bf5e3fb76a req-1765e9be-832c-42bf-9892-80e9f8e61b7c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-changed-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.957 182938 DEBUG nova.compute.manager [req-80d87909-6448-4e4a-9605-44bf5e3fb76a req-1765e9be-832c-42bf-9892-80e9f8e61b7c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing instance network info cache due to event network-changed-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.958 182938 DEBUG oslo_concurrency.lockutils [req-80d87909-6448-4e4a-9605-44bf5e3fb76a req-1765e9be-832c-42bf-9892-80e9f8e61b7c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.958 182938 DEBUG oslo_concurrency.lockutils [req-80d87909-6448-4e4a-9605-44bf5e3fb76a req-1765e9be-832c-42bf-9892-80e9f8e61b7c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:39:36 np0005603500 nova_compute[182934]: 2026-01-31 06:39:36.958 182938 DEBUG nova.network.neutron [req-80d87909-6448-4e4a-9605-44bf5e3fb76a req-1765e9be-832c-42bf-9892-80e9f8e61b7c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing network info cache for port df57296c-8ac3-44b0-8fb6-9f85d0a93bdc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:39:37 np0005603500 nova_compute[182934]: 2026-01-31 06:39:37.079 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:39:37 np0005603500 nova_compute[182934]: 2026-01-31 06:39:37.080 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:38 np0005603500 nova_compute[182934]: 2026-01-31 06:39:38.081 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:39:38 np0005603500 nova_compute[182934]: 2026-01-31 06:39:38.082 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:39:38 np0005603500 nova_compute[182934]: 2026-01-31 06:39:38.484 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:38 np0005603500 nova_compute[182934]: 2026-01-31 06:39:38.606 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:39:38 np0005603500 nova_compute[182934]: 2026-01-31 06:39:38.607 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:39:38 np0005603500 nova_compute[182934]: 2026-01-31 06:39:38.795 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:39 np0005603500 podman[215398]: 2026-01-31 06:39:39.163930454 +0000 UTC m=+0.085016358 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:39:41 np0005603500 podman[215435]: 2026-01-31 06:39:41.1460159 +0000 UTC m=+0.054094773 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:39:42 np0005603500 nova_compute[182934]: 2026-01-31 06:39:42.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:39:42 np0005603500 nova_compute[182934]: 2026-01-31 06:39:42.166 182938 DEBUG nova.network.neutron [req-80d87909-6448-4e4a-9605-44bf5e3fb76a req-1765e9be-832c-42bf-9892-80e9f8e61b7c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updated VIF entry in instance network info cache for port df57296c-8ac3-44b0-8fb6-9f85d0a93bdc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:39:42 np0005603500 nova_compute[182934]: 2026-01-31 06:39:42.166 182938 DEBUG nova.network.neutron [req-80d87909-6448-4e4a-9605-44bf5e3fb76a req-1765e9be-832c-42bf-9892-80e9f8e61b7c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:39:42 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:42Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:6a:c9 10.100.0.14
Jan 31 01:39:42 np0005603500 ovn_controller[95398]: 2026-01-31T06:39:42Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:6a:c9 10.100.0.14
Jan 31 01:39:42 np0005603500 nova_compute[182934]: 2026-01-31 06:39:42.732 182938 DEBUG oslo_concurrency.lockutils [req-80d87909-6448-4e4a-9605-44bf5e3fb76a req-1765e9be-832c-42bf-9892-80e9f8e61b7c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:39:43 np0005603500 nova_compute[182934]: 2026-01-31 06:39:43.486 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:43 np0005603500 nova_compute[182934]: 2026-01-31 06:39:43.797 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:48 np0005603500 nova_compute[182934]: 2026-01-31 06:39:48.488 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:48 np0005603500 nova_compute[182934]: 2026-01-31 06:39:48.798 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:49 np0005603500 nova_compute[182934]: 2026-01-31 06:39:49.398 182938 INFO nova.compute.manager [None req-5729165f-1b8f-43b8-800f-975602cac084 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Get console output
Jan 31 01:39:49 np0005603500 nova_compute[182934]: 2026-01-31 06:39:49.403 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:39:53 np0005603500 nova_compute[182934]: 2026-01-31 06:39:53.490 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:53 np0005603500 nova_compute[182934]: 2026-01-31 06:39:53.801 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:54.117 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:d7:fb 10.100.0.17'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.17/28', 'neutron:device_id': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07de4316-ef81-43cb-97c3-675146f1c643, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=38074b11-3b13-4706-9a79-123dd621d667) old=Port_Binding(mac=['fa:16:3e:59:d7:fb'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:39:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:54.119 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 38074b11-3b13-4706-9a79-123dd621d667 in datapath 8e0e18fe-0b80-4494-a6db-546b6daf5fd2 updated
Jan 31 01:39:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:54.120 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e0e18fe-0b80-4494-a6db-546b6daf5fd2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:39:54 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:54.121 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2ecce5-6019-4013-9812-b94b42f8a240]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:39:54 np0005603500 podman[215461]: 2026-01-31 06:39:54.136106212 +0000 UTC m=+0.052216553 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:39:54 np0005603500 podman[215460]: 2026-01-31 06:39:54.154485428 +0000 UTC m=+0.071850038 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:39:55 np0005603500 nova_compute[182934]: 2026-01-31 06:39:55.629 182938 DEBUG oslo_concurrency.lockutils [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "interface-8515b99e-89c2-4a03-9a14-d6a0c3dca692-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:55 np0005603500 nova_compute[182934]: 2026-01-31 06:39:55.630 182938 DEBUG oslo_concurrency.lockutils [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "interface-8515b99e-89c2-4a03-9a14-d6a0c3dca692-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:55 np0005603500 nova_compute[182934]: 2026-01-31 06:39:55.630 182938 DEBUG nova.objects.instance [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'flavor' on Instance uuid 8515b99e-89c2-4a03-9a14-d6a0c3dca692 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:39:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:55.958 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:39:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:55.958 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:39:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:39:55.959 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:39:57 np0005603500 nova_compute[182934]: 2026-01-31 06:39:57.116 182938 DEBUG nova.objects.instance [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8515b99e-89c2-4a03-9a14-d6a0c3dca692 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:39:57 np0005603500 nova_compute[182934]: 2026-01-31 06:39:57.626 182938 DEBUG nova.objects.base [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Object Instance<8515b99e-89c2-4a03-9a14-d6a0c3dca692> lazy-loaded attributes: flavor,pci_requests wrapper /usr/lib/python3.9/site-packages/nova/objects/base.py:136
Jan 31 01:39:57 np0005603500 nova_compute[182934]: 2026-01-31 06:39:57.626 182938 DEBUG nova.network.neutron [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:39:58 np0005603500 nova_compute[182934]: 2026-01-31 06:39:58.493 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:39:58 np0005603500 nova_compute[182934]: 2026-01-31 06:39:58.797 182938 DEBUG nova.policy [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:39:58 np0005603500 nova_compute[182934]: 2026-01-31 06:39:58.804 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:00 np0005603500 nova_compute[182934]: 2026-01-31 06:40:00.048 182938 DEBUG nova.network.neutron [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Successfully created port: 33eb9351-9906-4872-b960-8ca2037338a5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:40:01 np0005603500 nova_compute[182934]: 2026-01-31 06:40:01.343 182938 DEBUG nova.network.neutron [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Successfully updated port: 33eb9351-9906-4872-b960-8ca2037338a5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:40:01 np0005603500 nova_compute[182934]: 2026-01-31 06:40:01.589 182938 DEBUG nova.compute.manager [req-d16f1e0b-a636-47d4-9e11-11aa5d095503 req-93d03377-36ee-4acc-bc6d-a50eb1eff61f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-changed-33eb9351-9906-4872-b960-8ca2037338a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:40:01 np0005603500 nova_compute[182934]: 2026-01-31 06:40:01.590 182938 DEBUG nova.compute.manager [req-d16f1e0b-a636-47d4-9e11-11aa5d095503 req-93d03377-36ee-4acc-bc6d-a50eb1eff61f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing instance network info cache due to event network-changed-33eb9351-9906-4872-b960-8ca2037338a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:40:01 np0005603500 nova_compute[182934]: 2026-01-31 06:40:01.590 182938 DEBUG oslo_concurrency.lockutils [req-d16f1e0b-a636-47d4-9e11-11aa5d095503 req-93d03377-36ee-4acc-bc6d-a50eb1eff61f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:40:01 np0005603500 nova_compute[182934]: 2026-01-31 06:40:01.590 182938 DEBUG oslo_concurrency.lockutils [req-d16f1e0b-a636-47d4-9e11-11aa5d095503 req-93d03377-36ee-4acc-bc6d-a50eb1eff61f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:40:01 np0005603500 nova_compute[182934]: 2026-01-31 06:40:01.591 182938 DEBUG nova.network.neutron [req-d16f1e0b-a636-47d4-9e11-11aa5d095503 req-93d03377-36ee-4acc-bc6d-a50eb1eff61f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing network info cache for port 33eb9351-9906-4872-b960-8ca2037338a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:40:01 np0005603500 nova_compute[182934]: 2026-01-31 06:40:01.851 182938 DEBUG oslo_concurrency.lockutils [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:40:02 np0005603500 podman[215501]: 2026-01-31 06:40:02.154474061 +0000 UTC m=+0.073656376 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2026-01-22T05:09:47Z, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': 
'/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter)
Jan 31 01:40:03 np0005603500 podman[215521]: 2026-01-31 06:40:03.145889221 +0000 UTC m=+0.061470349 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, 
tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:40:03 np0005603500 nova_compute[182934]: 2026-01-31 06:40:03.548 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:03 np0005603500 nova_compute[182934]: 2026-01-31 06:40:03.806 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:06 np0005603500 nova_compute[182934]: 2026-01-31 06:40:06.772 182938 DEBUG nova.network.neutron [req-d16f1e0b-a636-47d4-9e11-11aa5d095503 req-93d03377-36ee-4acc-bc6d-a50eb1eff61f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Added VIF to instance network info cache for port 33eb9351-9906-4872-b960-8ca2037338a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3546
Jan 31 01:40:06 np0005603500 nova_compute[182934]: 2026-01-31 06:40:06.773 182938 DEBUG nova.network.neutron [req-d16f1e0b-a636-47d4-9e11-11aa5d095503 req-93d03377-36ee-4acc-bc6d-a50eb1eff61f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:40:07 np0005603500 nova_compute[182934]: 2026-01-31 06:40:07.329 182938 DEBUG oslo_concurrency.lockutils [req-d16f1e0b-a636-47d4-9e11-11aa5d095503 req-93d03377-36ee-4acc-bc6d-a50eb1eff61f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:40:07 np0005603500 nova_compute[182934]: 2026-01-31 06:40:07.330 182938 DEBUG oslo_concurrency.lockutils [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:40:07 np0005603500 nova_compute[182934]: 2026-01-31 06:40:07.331 182938 DEBUG nova.network.neutron [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:40:08 np0005603500 nova_compute[182934]: 2026-01-31 06:40:08.550 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:08 np0005603500 nova_compute[182934]: 2026-01-31 06:40:08.775 182938 WARNING nova.network.neutron [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] 8e0e18fe-0b80-4494-a6db-546b6daf5fd2 already exists in list: networks containing: ['8e0e18fe-0b80-4494-a6db-546b6daf5fd2']. ignoring it
Jan 31 01:40:08 np0005603500 nova_compute[182934]: 2026-01-31 06:40:08.776 182938 WARNING nova.network.neutron [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] 33eb9351-9906-4872-b960-8ca2037338a5 already exists in list: port_ids containing: ['33eb9351-9906-4872-b960-8ca2037338a5']. ignoring it
Jan 31 01:40:08 np0005603500 nova_compute[182934]: 2026-01-31 06:40:08.807 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:10 np0005603500 podman[215548]: 2026-01-31 06:40:10.139264902 +0000 UTC m=+0.054188476 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:40:12 np0005603500 podman[215568]: 2026-01-31 06:40:12.124414645 +0000 UTC m=+0.043546427 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 31 01:40:13 np0005603500 nova_compute[182934]: 2026-01-31 06:40:13.595 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:13 np0005603500 nova_compute[182934]: 2026-01-31 06:40:13.810 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:14 np0005603500 nova_compute[182934]: 2026-01-31 06:40:14.117 182938 DEBUG nova.network.neutron [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.828 182938 DEBUG oslo_concurrency.lockutils [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.831 182938 DEBUG nova.virt.libvirt.vif [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:39:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-3559488',display_name='tempest-TestNetworkBasicOps-server-3559488',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-3559488',id=6,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChYKDUjvhjCGs5+r/emnCrvtzxjRm0xGy2EtjvWZf5uSCdLdeFqDloA/lKvnHUNOd7mjbOXH0cnbc13Uy4w1Pw4Qnj3hkpXumiALlpO/vfZJol+WTWZCmQRTTzYN7UeJQ==',key_name='tempest-TestNetworkBasicOps-1595486681',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:39:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-gu629u0n',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:39:29Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=8515b99e-89c2-4a03-9a14-d6a0c3dca692,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.831 182938 DEBUG nova.network.os_vif_util [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.832 182938 DEBUG nova.network.os_vif_util [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.832 182938 DEBUG os_vif [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.833 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.833 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.834 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.835 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.835 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '68e2f58b-29e1-51a5-ab17-8ee9abe346ec', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.874 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.875 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.877 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.878 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33eb9351-99, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.878 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap33eb9351-99, col_values=(('qos', UUID('c0228587-d3cb-4ae5-b27b-9ddc8f16b4a3')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.878 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap33eb9351-99, col_values=(('external_ids', {'iface-id': '33eb9351-9906-4872-b960-8ca2037338a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:e4:61', 'vm-uuid': '8515b99e-89c2-4a03-9a14-d6a0c3dca692'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.880 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 NetworkManager[55506]: <info>  [1769841615.8815] manager: (tap33eb9351-99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.882 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.886 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.887 182938 INFO os_vif [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99')
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.888 182938 DEBUG nova.virt.libvirt.vif [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:39:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-3559488',display_name='tempest-TestNetworkBasicOps-server-3559488',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-3559488',id=6,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChYKDUjvhjCGs5+r/emnCrvtzxjRm0xGy2EtjvWZf5uSCdLdeFqDloA/lKvnHUNOd7mjbOXH0cnbc13Uy4w1Pw4Qnj3hkpXumiALlpO/vfZJol+WTWZCmQRTTzYN7UeJQ==',key_name='tempest-TestNetworkBasicOps-1595486681',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:39:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-gu629u0n',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:39:29Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=8515b99e-89c2-4a03-9a14-d6a0c3dca692,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.888 182938 DEBUG nova.network.os_vif_util [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.888 182938 DEBUG nova.network.os_vif_util [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.891 182938 DEBUG nova.virt.libvirt.guest [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] attach device xml: <interface type="ethernet">
Jan 31 01:40:15 np0005603500 nova_compute[182934]:  <mac address="fa:16:3e:26:e4:61"/>
Jan 31 01:40:15 np0005603500 nova_compute[182934]:  <model type="virtio"/>
Jan 31 01:40:15 np0005603500 nova_compute[182934]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:40:15 np0005603500 nova_compute[182934]:  <mtu size="1442"/>
Jan 31 01:40:15 np0005603500 nova_compute[182934]:  <target dev="tap33eb9351-99"/>
Jan 31 01:40:15 np0005603500 nova_compute[182934]: </interface>
Jan 31 01:40:15 np0005603500 nova_compute[182934]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:336
Jan 31 01:40:15 np0005603500 kernel: tap33eb9351-99: entered promiscuous mode
Jan 31 01:40:15 np0005603500 NetworkManager[55506]: <info>  [1769841615.9011] manager: (tap33eb9351-99): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Jan 31 01:40:15 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:15Z|00106|binding|INFO|Claiming lport 33eb9351-9906-4872-b960-8ca2037338a5 for this chassis.
Jan 31 01:40:15 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:15Z|00107|binding|INFO|33eb9351-9906-4872-b960-8ca2037338a5: Claiming fa:16:3e:26:e4:61 10.100.0.23
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.903 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.920 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 systemd-udevd[215598]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:40:15 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:15Z|00108|binding|INFO|Setting lport 33eb9351-9906-4872-b960-8ca2037338a5 ovn-installed in OVS
Jan 31 01:40:15 np0005603500 nova_compute[182934]: 2026-01-31 06:40:15.925 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:15 np0005603500 NetworkManager[55506]: <info>  [1769841615.9333] device (tap33eb9351-99): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:40:15 np0005603500 NetworkManager[55506]: <info>  [1769841615.9345] device (tap33eb9351-99): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:40:16 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:16Z|00109|binding|INFO|Setting lport 33eb9351-9906-4872-b960-8ca2037338a5 up in Southbound
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.300 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:e4:61 10.100.0.23'], port_security=['fa:16:3e:26:e4:61 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '8515b99e-89c2-4a03-9a14-d6a0c3dca692', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '859450e8-8706-4a8f-9018-d6c18f1c21cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07de4316-ef81-43cb-97c3-675146f1c643, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=33eb9351-9906-4872-b960-8ca2037338a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.301 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 33eb9351-9906-4872-b960-8ca2037338a5 in datapath 8e0e18fe-0b80-4494-a6db-546b6daf5fd2 bound to our chassis
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.302 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e0e18fe-0b80-4494-a6db-546b6daf5fd2
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.311 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e474cfaf-5dc0-4bad-860e-139bacb73d49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.311 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e0e18fe-01 in ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.313 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e0e18fe-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.313 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0219c5-1185-4a33-a38f-82d2049ccf01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.314 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[1af14d5c-e5d5-42c3-b0c8-ee44516f868a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.323 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[f2eb30ee-df8c-4a89-b925-1d7f7f76c1d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.334 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[7450bfa0-973c-43a4-9192-7bee38cf2e3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.356 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[d19a0520-6d8e-40a4-b789-feff49e580b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.361 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[782a6ef2-3da2-4c0b-8c62-46453ddf9286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 NetworkManager[55506]: <info>  [1769841616.3624] manager: (tap8e0e18fe-00): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.382 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7add48-40c3-4847-bc56-c2262b0c3c8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.384 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[bd099647-696b-4020-b4c1-90ccecda9525]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 NetworkManager[55506]: <info>  [1769841616.3984] device (tap8e0e18fe-00): carrier: link connected
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.401 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[a654cafa-d044-4412-b338-e99d9a69ef24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.414 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[633e0e11-db9f-4335-b292-512f2e047123]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e18fe-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:d7:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 392769, 'reachable_time': 15961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215625, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.427 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[2b85f865-1a5f-4075-94fd-24cc7b1ae0f4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:d7fb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 392769, 'tstamp': 392769}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215626, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.439 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[02445561-eb26-4651-ac93-0627463636b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e18fe-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:d7:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 392769, 'reachable_time': 15961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215627, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.462 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6674f159-5672-4806-b044-4d2668df73f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.505 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[232e2a36-273c-474f-a273-f690d5d6dda1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.507 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e18fe-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.507 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.507 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e0e18fe-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:16 np0005603500 nova_compute[182934]: 2026-01-31 06:40:16.509 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:16 np0005603500 NetworkManager[55506]: <info>  [1769841616.5097] manager: (tap8e0e18fe-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 31 01:40:16 np0005603500 kernel: tap8e0e18fe-00: entered promiscuous mode
Jan 31 01:40:16 np0005603500 nova_compute[182934]: 2026-01-31 06:40:16.511 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.512 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e0e18fe-00, col_values=(('external_ids', {'iface-id': '38074b11-3b13-4706-9a79-123dd621d667'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:16 np0005603500 nova_compute[182934]: 2026-01-31 06:40:16.513 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:16 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:16Z|00110|binding|INFO|Releasing lport 38074b11-3b13-4706-9a79-123dd621d667 from this chassis (sb_readonly=0)
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.515 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b8ac78-4d48-4cbd-b579-f211dee3cfe2]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.515 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.516 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.516 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8e0e18fe-0b80-4494-a6db-546b6daf5fd2 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.516 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:40:16 np0005603500 nova_compute[182934]: 2026-01-31 06:40:16.517 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.517 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6c096300-490e-4037-af26-c316142c4f8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.518 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.518 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[bded33e8-a0af-44ef-954b-567bd55c1485]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.519 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-8e0e18fe-0b80-4494-a6db-546b6daf5fd2
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID 8e0e18fe-0b80-4494-a6db-546b6daf5fd2
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:40:16 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:16.519 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'env', 'PROCESS_TAG=haproxy-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:40:16 np0005603500 podman[215657]: 2026-01-31 06:40:16.802444889 +0000 UTC m=+0.021931070 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:40:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:17.035 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.035 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.083 182938 DEBUG nova.compute.manager [req-a02b606c-14a3-403e-a20a-6bbe34729e2c req-84d10351-48be-4ede-8b2d-3732a704955e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-plugged-33eb9351-9906-4872-b960-8ca2037338a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.084 182938 DEBUG oslo_concurrency.lockutils [req-a02b606c-14a3-403e-a20a-6bbe34729e2c req-84d10351-48be-4ede-8b2d-3732a704955e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.084 182938 DEBUG oslo_concurrency.lockutils [req-a02b606c-14a3-403e-a20a-6bbe34729e2c req-84d10351-48be-4ede-8b2d-3732a704955e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.084 182938 DEBUG oslo_concurrency.lockutils [req-a02b606c-14a3-403e-a20a-6bbe34729e2c req-84d10351-48be-4ede-8b2d-3732a704955e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.084 182938 DEBUG nova.compute.manager [req-a02b606c-14a3-403e-a20a-6bbe34729e2c req-84d10351-48be-4ede-8b2d-3732a704955e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] No waiting events found dispatching network-vif-plugged-33eb9351-9906-4872-b960-8ca2037338a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.084 182938 WARNING nova.compute.manager [req-a02b606c-14a3-403e-a20a-6bbe34729e2c req-84d10351-48be-4ede-8b2d-3732a704955e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received unexpected event network-vif-plugged-33eb9351-9906-4872-b960-8ca2037338a5 for instance with vm_state active and task_state None.
Jan 31 01:40:17 np0005603500 podman[215657]: 2026-01-31 06:40:17.496317573 +0000 UTC m=+0.715803724 container create c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:40:17 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:17Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:26:e4:61 10.100.0.23
Jan 31 01:40:17 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:17Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:26:e4:61 10.100.0.23
Jan 31 01:40:17 np0005603500 systemd[1]: Started libpod-conmon-c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a.scope.
Jan 31 01:40:17 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:40:17 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fd377f790dd66126ef63ff528e5db2accba3de3e5aeafaafeffc9138a0dd798/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.801 182938 DEBUG nova.virt.libvirt.driver [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.802 182938 DEBUG nova.virt.libvirt.driver [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.802 182938 DEBUG nova.virt.libvirt.driver [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:bf:6a:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:40:17 np0005603500 nova_compute[182934]: 2026-01-31 06:40:17.802 182938 DEBUG nova.virt.libvirt.driver [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:26:e4:61, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:40:17 np0005603500 podman[215657]: 2026-01-31 06:40:17.845329758 +0000 UTC m=+1.064815909 container init c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:40:17 np0005603500 podman[215657]: 2026-01-31 06:40:17.851827474 +0000 UTC m=+1.071313625 container start c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3)
Jan 31 01:40:17 np0005603500 neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2[215672]: [NOTICE]   (215676) : New worker (215678) forked
Jan 31 01:40:17 np0005603500 neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2[215672]: [NOTICE]   (215676) : Loading success.
Jan 31 01:40:17 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:17.977 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:40:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:17.987 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/8515b99e-89c2-4a03-9a14-d6a0c3dca692 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}9de33c3c4c813c7413c734743528a34030291a616c281269e5092e293b0fad44" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Jan 31 01:40:18 np0005603500 nova_compute[182934]: 2026-01-31 06:40:18.311 182938 DEBUG nova.virt.driver [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-3559488', uuid='8515b99e-89c2-4a03-9a14-d6a0c3dca692'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bit
torrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", 
"bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841618.3116362) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:40:18 np0005603500 nova_compute[182934]: 2026-01-31 06:40:18.312 182938 DEBUG nova.virt.libvirt.guest [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-3559488</nova:name>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:40:18</nova:creationTime>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    <nova:port uuid="df57296c-8ac3-44b0-8fb6-9f85d0a93bdc">
Jan 31 01:40:18 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    <nova:port uuid="33eb9351-9906-4872-b960-8ca2037338a5">
Jan 31 01:40:18 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.23" ipVersion="4"/>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:40:18 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:40:18 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:40:18 np0005603500 nova_compute[182934]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Jan 31 01:40:18 np0005603500 nova_compute[182934]: 2026-01-31 06:40:18.599 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:18 np0005603500 nova_compute[182934]: 2026-01-31 06:40:18.827 182938 DEBUG oslo_concurrency.lockutils [None req-e0c876e8-41b0-43a8-9569-6fab83bba198 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "interface-8515b99e-89c2-4a03-9a14-d6a0c3dca692-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 23.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.681 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 2147 Content-Type: application/json Date: Sat, 31 Jan 2026 06:40:18 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-f68c8953-9e53-48fd-bb45-c9b8f300c7ce x-openstack-request-id: req-f68c8953-9e53-48fd-bb45-c9b8f300c7ce _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.681 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "8515b99e-89c2-4a03-9a14-d6a0c3dca692", "name": "tempest-TestNetworkBasicOps-server-3559488", "status": "ACTIVE", "tenant_id": "829310cd8381494e96216dba067ff8d3", "user_id": "dddc34b0385a49a5bd9bf081ed29e9fd", "metadata": {}, "hostId": "0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e", "image": {"id": "9f613975-b701-42a0-9b35-7d5c4a2cb7f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/9f613975-b701-42a0-9b35-7d5c4a2cb7f2"}]}, "flavor": {"id": "9956992e-a3ca-497f-9747-3ae270e07def", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9956992e-a3ca-497f-9747-3ae270e07def"}]}, "created": "2026-01-31T06:39:06Z", "updated": "2026-01-31T06:39:29Z", "addresses": {"tempest-network-smoke--720963365": [{"version": 4, "addr": "10.100.0.14", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:bf:6a:c9"}, {"version": 4, "addr": "192.168.122.222", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:bf:6a:c9"}], "tempest-network-smoke--497575700": [{"version": 4, "addr": "10.100.0.23", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:26:e4:61"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/8515b99e-89c2-4a03-9a14-d6a0c3dca692"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/8515b99e-89c2-4a03-9a14-d6a0c3dca692"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-1595486681", "OS-SRV-USG:launched_at": "2026-01-31T06:39:29.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "default"}, {"name": "tempest-secgroup-smoke-1605993175"}], "OS-EXT-SRV-ATTR:host": 
"compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000006", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.682 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/8515b99e-89c2-4a03-9a14-d6a0c3dca692 used request id req-f68c8953-9e53-48fd-bb45-c9b8f300c7ce request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.682 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8515b99e-89c2-4a03-9a14-d6a0c3dca692', 'name': 'tempest-TestNetworkBasicOps-server-3559488', 'flavor': {'id': '9956992e-a3ca-497f-9747-3ae270e07def', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '829310cd8381494e96216dba067ff8d3', 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'hostId': '0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.683 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.683 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.683 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.684 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.684 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2026-01-31T06:40:19.684047) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.685 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.685 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.685 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.686 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.686 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.686 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.686 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2026-01-31T06:40:19.686392) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.686 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.687 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-3559488>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-3559488>]
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.687 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.687 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.687 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.688 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.688 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.688 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2026-01-31T06:40:19.688236) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.692 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8515b99e-89c2-4a03-9a14-d6a0c3dca692 / tapdf57296c-8a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.692 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8515b99e-89c2-4a03-9a14-d6a0c3dca692 / tap33eb9351-99 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.693 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.bytes volume: 10520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.693 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.bytes volume: 1330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.693 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.693 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.694 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.694 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.694 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.694 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2026-01-31T06:40:19.694449) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.694 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.710 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/memory.usage volume: 43.47265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.711 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.711 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.712 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.712 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.712 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.712 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.712 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2026-01-31T06:40:19.712357) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.712 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.713 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.713 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.713 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.713 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.713 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.714 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.714 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.714 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2026-01-31T06:40:19.714213) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.742 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.write.bytes volume: 72970240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.743 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.743 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.744 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.744 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.744 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.744 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.744 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2026-01-31T06:40:19.744679) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.744 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.745 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.read.latency volume: 2458653850 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.745 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.read.latency volume: 69953710 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.746 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.746 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.746 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.746 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.746 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.746 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2026-01-31T06:40:19.746746) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.746 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.747 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.747 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.747 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.748 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.748 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.748 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.748 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.748 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.748 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.749 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.749 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2026-01-31T06:40:19.748662) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.749 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.749 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.750 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.750 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.750 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.750 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.750 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2026-01-31T06:40:19.750443) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.750 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/cpu volume: 12520000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.751 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.751 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.751 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.751 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.751 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.751 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.752 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2026-01-31T06:40:19.751867) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.752 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.packets volume: 58 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.752 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.753 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.753 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.753 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.753 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.753 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.753 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.754 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2026-01-31T06:40:19.753935) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.754 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.754 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.755 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.755 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.755 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.755 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.755 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.755 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.756 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2026-01-31T06:40:19.755890) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.756 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.read.bytes volume: 30513664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.756 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.757 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.757 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.757 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.757 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.757 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.758 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.758 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2026-01-31T06:40:19.758153) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.758 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.758 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.759 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.759 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.759 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.759 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.760 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.760 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.760 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2026-01-31T06:40:19.760113) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.760 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.packets volume: 57 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.760 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.761 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.761 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.761 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.761 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.761 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.762 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.762 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2026-01-31T06:40:19.762001) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.762 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.read.requests volume: 1098 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.762 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.763 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.764 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.764 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.764 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.764 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.764 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.765 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2026-01-31T06:40:19.764779) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.781 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.781 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.782 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.782 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.783 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.783 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.783 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.783 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.784 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.write.latency volume: 16016715093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.783 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2026-01-31T06:40:19.783549) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.784 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.785 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.785 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.785 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.785 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.785 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.786 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.786 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2026-01-31T06:40:19.785966) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.786 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.786 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.787 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.787 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.787 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.787 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.787 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2026-01-31T06:40:19.787559) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.787 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.788 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.788 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.789 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.789 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.789 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.789 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.789 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.790 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2026-01-31T06:40:19.789824) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.790 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.790 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.791 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.791 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.791 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.791 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.792 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.792 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.792 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2026-01-31T06:40:19.792161) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.792 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.793 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.793 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.793 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.793 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.793 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.793 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.793 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2026-01-31T06:40:19.793747) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.794 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.794 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-3559488>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-3559488>]
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.794 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.794 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.794 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.794 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.795 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.795 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2026-01-31T06:40:19.795120) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.795 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.write.requests volume: 307 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.795 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.796 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.796 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.796 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.796 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.796 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.797 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.797 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2026-01-31T06:40:19.797124) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.797 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.797 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.798 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.798 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.798 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.799 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.799 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.799 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.799 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2026-01-31T06:40:19.799348) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.799 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.bytes volume: 8398 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.800 16 DEBUG ceilometer.compute.pollsters [-] 8515b99e-89c2-4a03-9a14-d6a0c3dca692/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:40:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:40:19.800 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:40:19 np0005603500 nova_compute[182934]: 2026-01-31 06:40:19.954 182938 DEBUG nova.compute.manager [req-75768598-40ec-4c80-b96f-57b06967454c req-98f16592-390d-4e64-af2c-5aee2b81a402 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-plugged-33eb9351-9906-4872-b960-8ca2037338a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:40:19 np0005603500 nova_compute[182934]: 2026-01-31 06:40:19.955 182938 DEBUG oslo_concurrency.lockutils [req-75768598-40ec-4c80-b96f-57b06967454c req-98f16592-390d-4e64-af2c-5aee2b81a402 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:19 np0005603500 nova_compute[182934]: 2026-01-31 06:40:19.955 182938 DEBUG oslo_concurrency.lockutils [req-75768598-40ec-4c80-b96f-57b06967454c req-98f16592-390d-4e64-af2c-5aee2b81a402 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:19 np0005603500 nova_compute[182934]: 2026-01-31 06:40:19.956 182938 DEBUG oslo_concurrency.lockutils [req-75768598-40ec-4c80-b96f-57b06967454c req-98f16592-390d-4e64-af2c-5aee2b81a402 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:19 np0005603500 nova_compute[182934]: 2026-01-31 06:40:19.956 182938 DEBUG nova.compute.manager [req-75768598-40ec-4c80-b96f-57b06967454c req-98f16592-390d-4e64-af2c-5aee2b81a402 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] No waiting events found dispatching network-vif-plugged-33eb9351-9906-4872-b960-8ca2037338a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:40:19 np0005603500 nova_compute[182934]: 2026-01-31 06:40:19.956 182938 WARNING nova.compute.manager [req-75768598-40ec-4c80-b96f-57b06967454c req-98f16592-390d-4e64-af2c-5aee2b81a402 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received unexpected event network-vif-plugged-33eb9351-9906-4872-b960-8ca2037338a5 for instance with vm_state active and task_state None.
Jan 31 01:40:19 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:19.978 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:20 np0005603500 nova_compute[182934]: 2026-01-31 06:40:20.881 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:23 np0005603500 nova_compute[182934]: 2026-01-31 06:40:23.622 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:25 np0005603500 podman[215689]: 2026-01-31 06:40:25.133271527 +0000 UTC m=+0.049354892 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:40:25 np0005603500 nova_compute[182934]: 2026-01-31 06:40:25.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:25 np0005603500 podman[215688]: 2026-01-31 06:40:25.153189921 +0000 UTC m=+0.072631183 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:40:25 np0005603500 nova_compute[182934]: 2026-01-31 06:40:25.883 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:28 np0005603500 nova_compute[182934]: 2026-01-31 06:40:28.625 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:30 np0005603500 nova_compute[182934]: 2026-01-31 06:40:30.885 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:32 np0005603500 nova_compute[182934]: 2026-01-31 06:40:32.679 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:33 np0005603500 podman[215725]: 2026-01-31 06:40:33.141955629 +0000 UTC m=+0.056458599 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, release=1769056855, vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 31 01:40:33 np0005603500 nova_compute[182934]: 2026-01-31 06:40:33.630 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:34 np0005603500 nova_compute[182934]: 2026-01-31 06:40:34.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:34 np0005603500 podman[215747]: 2026-01-31 06:40:34.160300557 +0000 UTC m=+0.075946060 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:40:35 np0005603500 nova_compute[182934]: 2026-01-31 06:40:35.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:35 np0005603500 nova_compute[182934]: 2026-01-31 06:40:35.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:40:35 np0005603500 nova_compute[182934]: 2026-01-31 06:40:35.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:35 np0005603500 nova_compute[182934]: 2026-01-31 06:40:35.663 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:35 np0005603500 nova_compute[182934]: 2026-01-31 06:40:35.663 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:35 np0005603500 nova_compute[182934]: 2026-01-31 06:40:35.663 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:35 np0005603500 nova_compute[182934]: 2026-01-31 06:40:35.664 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:40:35 np0005603500 nova_compute[182934]: 2026-01-31 06:40:35.909 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:36 np0005603500 nova_compute[182934]: 2026-01-31 06:40:36.700 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:40:36 np0005603500 nova_compute[182934]: 2026-01-31 06:40:36.753 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:40:36 np0005603500 nova_compute[182934]: 2026-01-31 06:40:36.754 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:40:36 np0005603500 nova_compute[182934]: 2026-01-31 06:40:36.799 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:40:36 np0005603500 nova_compute[182934]: 2026-01-31 06:40:36.946 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:40:36 np0005603500 nova_compute[182934]: 2026-01-31 06:40:36.948 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5583MB free_disk=73.18342971801758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:40:36 np0005603500 nova_compute[182934]: 2026-01-31 06:40:36.949 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:36 np0005603500 nova_compute[182934]: 2026-01-31 06:40:36.949 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:37 np0005603500 nova_compute[182934]: 2026-01-31 06:40:37.481 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "011d72bb-c539-4b1c-bd71-14adf765cc17" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:37 np0005603500 nova_compute[182934]: 2026-01-31 06:40:37.482 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:37 np0005603500 nova_compute[182934]: 2026-01-31 06:40:37.987 182938 DEBUG nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:40:38 np0005603500 nova_compute[182934]: 2026-01-31 06:40:38.004 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:40:38 np0005603500 nova_compute[182934]: 2026-01-31 06:40:38.511 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 011d72bb-c539-4b1c-bd71-14adf765cc17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1794
Jan 31 01:40:38 np0005603500 nova_compute[182934]: 2026-01-31 06:40:38.511 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:40:38 np0005603500 nova_compute[182934]: 2026-01-31 06:40:38.511 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:40:38 np0005603500 nova_compute[182934]: 2026-01-31 06:40:38.516 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:38 np0005603500 nova_compute[182934]: 2026-01-31 06:40:38.568 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:40:38 np0005603500 nova_compute[182934]: 2026-01-31 06:40:38.631 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:39 np0005603500 nova_compute[182934]: 2026-01-31 06:40:39.075 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:40:39 np0005603500 nova_compute[182934]: 2026-01-31 06:40:39.077 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:40:39 np0005603500 nova_compute[182934]: 2026-01-31 06:40:39.078 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:39 np0005603500 nova_compute[182934]: 2026-01-31 06:40:39.078 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:39 np0005603500 nova_compute[182934]: 2026-01-31 06:40:39.196 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:40:39 np0005603500 nova_compute[182934]: 2026-01-31 06:40:39.196 182938 INFO nova.compute.claims [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:40:40 np0005603500 nova_compute[182934]: 2026-01-31 06:40:40.079 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:40 np0005603500 nova_compute[182934]: 2026-01-31 06:40:40.079 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:40 np0005603500 nova_compute[182934]: 2026-01-31 06:40:40.079 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:40 np0005603500 nova_compute[182934]: 2026-01-31 06:40:40.269 182938 DEBUG nova.compute.provider_tree [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:40:40 np0005603500 nova_compute[182934]: 2026-01-31 06:40:40.776 182938 DEBUG nova.scheduler.client.report [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:40:40 np0005603500 nova_compute[182934]: 2026-01-31 06:40:40.913 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:41 np0005603500 podman[215780]: 2026-01-31 06:40:41.14870775 +0000 UTC m=+0.063609488 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 01:40:41 np0005603500 nova_compute[182934]: 2026-01-31 06:40:41.288 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:41 np0005603500 nova_compute[182934]: 2026-01-31 06:40:41.288 182938 DEBUG nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:40:41 np0005603500 nova_compute[182934]: 2026-01-31 06:40:41.799 182938 DEBUG nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:40:41 np0005603500 nova_compute[182934]: 2026-01-31 06:40:41.799 182938 DEBUG nova.network.neutron [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:40:42 np0005603500 nova_compute[182934]: 2026-01-31 06:40:42.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:42 np0005603500 nova_compute[182934]: 2026-01-31 06:40:42.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Jan 31 01:40:42 np0005603500 nova_compute[182934]: 2026-01-31 06:40:42.308 182938 INFO nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:40:42 np0005603500 nova_compute[182934]: 2026-01-31 06:40:42.665 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Jan 31 01:40:42 np0005603500 nova_compute[182934]: 2026-01-31 06:40:42.792 182938 DEBUG nova.policy [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:40:42 np0005603500 nova_compute[182934]: 2026-01-31 06:40:42.849 182938 DEBUG nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:40:43 np0005603500 podman[215800]: 2026-01-31 06:40:43.13253521 +0000 UTC m=+0.048770264 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.633 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.665 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.872 182938 DEBUG nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.873 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.873 182938 INFO nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Creating image(s)
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.874 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.874 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.874 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.875 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.879 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.880 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.927 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.929 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.929 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.930 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.934 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.934 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.973 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:40:43 np0005603500 nova_compute[182934]: 2026-01-31 06:40:43.974 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.324 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk 1073741824" returned: 0 in 0.349s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.325 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.396s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.326 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.391 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.391 182938 DEBUG nova.virt.disk.api [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.392 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.434 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.435 182938 DEBUG nova.virt.disk.api [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.435 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.436 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Ensure instance console log exists: /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.436 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.436 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.436 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:44 np0005603500 nova_compute[182934]: 2026-01-31 06:40:44.764 182938 DEBUG nova.network.neutron [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Successfully created port: ddeb4802-5c13-4b7c-9ab2-03de55b010c4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:40:45 np0005603500 nova_compute[182934]: 2026-01-31 06:40:45.915 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:46 np0005603500 nova_compute[182934]: 2026-01-31 06:40:46.102 182938 DEBUG nova.network.neutron [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Successfully updated port: ddeb4802-5c13-4b7c-9ab2-03de55b010c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:40:46 np0005603500 nova_compute[182934]: 2026-01-31 06:40:46.364 182938 DEBUG nova.compute.manager [req-581ceef8-7e3a-4ff6-bc0f-21293dda2516 req-fc9488cc-d782-4c72-a66e-9d2efb8ecbe6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Received event network-changed-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:40:46 np0005603500 nova_compute[182934]: 2026-01-31 06:40:46.364 182938 DEBUG nova.compute.manager [req-581ceef8-7e3a-4ff6-bc0f-21293dda2516 req-fc9488cc-d782-4c72-a66e-9d2efb8ecbe6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Refreshing instance network info cache due to event network-changed-ddeb4802-5c13-4b7c-9ab2-03de55b010c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:40:46 np0005603500 nova_compute[182934]: 2026-01-31 06:40:46.364 182938 DEBUG oslo_concurrency.lockutils [req-581ceef8-7e3a-4ff6-bc0f-21293dda2516 req-fc9488cc-d782-4c72-a66e-9d2efb8ecbe6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-011d72bb-c539-4b1c-bd71-14adf765cc17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:40:46 np0005603500 nova_compute[182934]: 2026-01-31 06:40:46.365 182938 DEBUG oslo_concurrency.lockutils [req-581ceef8-7e3a-4ff6-bc0f-21293dda2516 req-fc9488cc-d782-4c72-a66e-9d2efb8ecbe6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-011d72bb-c539-4b1c-bd71-14adf765cc17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:40:46 np0005603500 nova_compute[182934]: 2026-01-31 06:40:46.365 182938 DEBUG nova.network.neutron [req-581ceef8-7e3a-4ff6-bc0f-21293dda2516 req-fc9488cc-d782-4c72-a66e-9d2efb8ecbe6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Refreshing network info cache for port ddeb4802-5c13-4b7c-9ab2-03de55b010c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:40:46 np0005603500 nova_compute[182934]: 2026-01-31 06:40:46.622 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-011d72bb-c539-4b1c-bd71-14adf765cc17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:40:47 np0005603500 nova_compute[182934]: 2026-01-31 06:40:47.293 182938 DEBUG nova.network.neutron [req-581ceef8-7e3a-4ff6-bc0f-21293dda2516 req-fc9488cc-d782-4c72-a66e-9d2efb8ecbe6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:40:48 np0005603500 nova_compute[182934]: 2026-01-31 06:40:48.117 182938 DEBUG nova.network.neutron [req-581ceef8-7e3a-4ff6-bc0f-21293dda2516 req-fc9488cc-d782-4c72-a66e-9d2efb8ecbe6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:40:48 np0005603500 nova_compute[182934]: 2026-01-31 06:40:48.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:40:48 np0005603500 nova_compute[182934]: 2026-01-31 06:40:48.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Jan 31 01:40:48 np0005603500 nova_compute[182934]: 2026-01-31 06:40:48.634 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:48 np0005603500 nova_compute[182934]: 2026-01-31 06:40:48.680 182938 DEBUG oslo_concurrency.lockutils [req-581ceef8-7e3a-4ff6-bc0f-21293dda2516 req-fc9488cc-d782-4c72-a66e-9d2efb8ecbe6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-011d72bb-c539-4b1c-bd71-14adf765cc17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:40:48 np0005603500 nova_compute[182934]: 2026-01-31 06:40:48.681 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-011d72bb-c539-4b1c-bd71-14adf765cc17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:40:48 np0005603500 nova_compute[182934]: 2026-01-31 06:40:48.681 182938 DEBUG nova.network.neutron [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:40:49 np0005603500 nova_compute[182934]: 2026-01-31 06:40:49.751 182938 DEBUG nova.network.neutron [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:40:50 np0005603500 nova_compute[182934]: 2026-01-31 06:40:50.919 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:52 np0005603500 nova_compute[182934]: 2026-01-31 06:40:52.804 182938 DEBUG nova.network.neutron [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Updating instance_info_cache with network_info: [{"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.313 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-011d72bb-c539-4b1c-bd71-14adf765cc17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.314 182938 DEBUG nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Instance network_info: |[{"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.316 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Start _get_guest_xml network_info=[{"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.319 182938 WARNING nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.320 182938 DEBUG nova.virt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1010426789', uuid='011d72bb-c539-4b1c-bd71-14adf765cc17'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_devic
e_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841653.3203886) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.331 182938 DEBUG nova.virt.libvirt.host [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.332 182938 DEBUG nova.virt.libvirt.host [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.338 182938 DEBUG nova.virt.libvirt.host [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.338 182938 DEBUG nova.virt.libvirt.host [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.339 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.339 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.340 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.340 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.340 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.340 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.340 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.341 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.341 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.341 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.341 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.341 182938 DEBUG nova.virt.hardware [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.344 182938 DEBUG nova.virt.libvirt.vif [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:40:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1010426789',display_name='tempest-TestNetworkBasicOps-server-1010426789',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1010426789',id=7,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA9hHTG0FgYJ4RDEfgLrvydcCLPpfNl926aCzkTSOPoigr7OR+EUaQrLzXRkEO6rzE65gx6hfCyIap+rU59sbLZkWobmbXEa5QfZeMW0CQGemHRakSUoqBbt+m0ZJ6Jidw==',key_name='tempest-TestNetworkBasicOps-753337127',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-k8dt423k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:40:42Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=011d72bb-c539-4b1c-bd71-14adf765cc17,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.345 182938 DEBUG nova.network.os_vif_util [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.345 182938 DEBUG nova.network.os_vif_util [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:d4:cd,bridge_name='br-int',has_traffic_filtering=True,id=ddeb4802-5c13-4b7c-9ab2-03de55b010c4,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddeb4802-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.346 182938 DEBUG nova.objects.instance [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 011d72bb-c539-4b1c-bd71-14adf765cc17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.696 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.855 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <uuid>011d72bb-c539-4b1c-bd71-14adf765cc17</uuid>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <name>instance-00000007</name>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-1010426789</nova:name>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:40:53</nova:creationTime>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:        <nova:port uuid="ddeb4802-5c13-4b7c-9ab2-03de55b010c4">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <entry name="serial">011d72bb-c539-4b1c-bd71-14adf765cc17</entry>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <entry name="uuid">011d72bb-c539-4b1c-bd71-14adf765cc17</entry>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk.config"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:eb:d4:cd"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <target dev="tapddeb4802-5c"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/console.log" append="off"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:40:53 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:40:53 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:40:53 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:40:53 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.856 182938 DEBUG nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Preparing to wait for external event network-vif-plugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.856 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.856 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.857 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.858 182938 DEBUG nova.virt.libvirt.vif [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:40:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1010426789',display_name='tempest-TestNetworkBasicOps-server-1010426789',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1010426789',id=7,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA9hHTG0FgYJ4RDEfgLrvydcCLPpfNl926aCzkTSOPoigr7OR+EUaQrLzXRkEO6rzE65gx6hfCyIap+rU59sbLZkWobmbXEa5QfZeMW0CQGemHRakSUoqBbt+m0ZJ6Jidw==',key_name='tempest-TestNetworkBasicOps-753337127',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-k8dt423k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:40:42Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=011d72bb-c539-4b1c-bd71-14adf765cc17,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.858 182938 DEBUG nova.network.os_vif_util [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.859 182938 DEBUG nova.network.os_vif_util [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:d4:cd,bridge_name='br-int',has_traffic_filtering=True,id=ddeb4802-5c13-4b7c-9ab2-03de55b010c4,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddeb4802-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.859 182938 DEBUG os_vif [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:d4:cd,bridge_name='br-int',has_traffic_filtering=True,id=ddeb4802-5c13-4b7c-9ab2-03de55b010c4,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddeb4802-5c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.860 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.860 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.861 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.862 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.862 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '26765daa-4768-5ef4-9964-71af7b5045d0', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.865 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.869 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.870 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapddeb4802-5c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.870 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapddeb4802-5c, col_values=(('qos', UUID('0d977891-63d3-4eaa-a9b9-d25810149f73')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.871 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapddeb4802-5c, col_values=(('external_ids', {'iface-id': 'ddeb4802-5c13-4b7c-9ab2-03de55b010c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:d4:cd', 'vm-uuid': '011d72bb-c539-4b1c-bd71-14adf765cc17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:53 np0005603500 NetworkManager[55506]: <info>  [1769841653.8731] manager: (tapddeb4802-5c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.875 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.878 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:53 np0005603500 nova_compute[182934]: 2026-01-31 06:40:53.879 182938 INFO os_vif [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:d4:cd,bridge_name='br-int',has_traffic_filtering=True,id=ddeb4802-5c13-4b7c-9ab2-03de55b010c4,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddeb4802-5c')
Jan 31 01:40:55 np0005603500 nova_compute[182934]: 2026-01-31 06:40:55.413 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:40:55 np0005603500 nova_compute[182934]: 2026-01-31 06:40:55.414 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:40:55 np0005603500 nova_compute[182934]: 2026-01-31 06:40:55.414 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:eb:d4:cd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:40:55 np0005603500 nova_compute[182934]: 2026-01-31 06:40:55.415 182938 INFO nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Using config drive
Jan 31 01:40:55 np0005603500 podman[215840]: 2026-01-31 06:40:55.489823106 +0000 UTC m=+0.050848184 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:40:55 np0005603500 podman[215841]: 2026-01-31 06:40:55.498615486 +0000 UTC m=+0.055664187 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 31 01:40:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:55.967 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:55.968 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:55.969 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.107 182938 INFO nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Creating config drive at /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk.config
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.111 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpigp_e0rk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.226 182938 DEBUG oslo_concurrency.processutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpigp_e0rk" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:40:57 np0005603500 kernel: tapddeb4802-5c: entered promiscuous mode
Jan 31 01:40:57 np0005603500 NetworkManager[55506]: <info>  [1769841657.2895] manager: (tapddeb4802-5c): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Jan 31 01:40:57 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:57Z|00111|binding|INFO|Claiming lport ddeb4802-5c13-4b7c-9ab2-03de55b010c4 for this chassis.
Jan 31 01:40:57 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:57Z|00112|binding|INFO|ddeb4802-5c13-4b7c-9ab2-03de55b010c4: Claiming fa:16:3e:eb:d4:cd 10.100.0.19
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.292 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:57 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:57Z|00113|binding|INFO|Setting lport ddeb4802-5c13-4b7c-9ab2-03de55b010c4 ovn-installed in OVS
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.302 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:57 np0005603500 ovn_controller[95398]: 2026-01-31T06:40:57Z|00114|binding|INFO|Setting lport ddeb4802-5c13-4b7c-9ab2-03de55b010c4 up in Southbound
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.306 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:d4:cd 10.100.0.19'], port_security=['fa:16:3e:eb:d4:cd 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '011d72bb-c539-4b1c-bd71-14adf765cc17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7e9d8b57-e99d-441c-8208-450a673f23cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07de4316-ef81-43cb-97c3-675146f1c643, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=ddeb4802-5c13-4b7c-9ab2-03de55b010c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.308 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.309 104644 INFO neutron.agent.ovn.metadata.agent [-] Port ddeb4802-5c13-4b7c-9ab2-03de55b010c4 in datapath 8e0e18fe-0b80-4494-a6db-546b6daf5fd2 bound to our chassis
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.310 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e0e18fe-0b80-4494-a6db-546b6daf5fd2
Jan 31 01:40:57 np0005603500 systemd-udevd[215900]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:40:57 np0005603500 systemd-machined[154375]: New machine qemu-7-instance-00000007.
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.325 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ee221b-4e0a-41e6-a8d1-2f213db4e497]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:57 np0005603500 NetworkManager[55506]: <info>  [1769841657.3372] device (tapddeb4802-5c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:40:57 np0005603500 NetworkManager[55506]: <info>  [1769841657.3381] device (tapddeb4802-5c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:40:57 np0005603500 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.350 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[de671455-2299-4367-bf21-bd67e556cb1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.353 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[4a5d19cc-9c24-4dd9-9370-c5fff5e9cd05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.376 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[37b72799-df90-4553-9d26-faf1510ffbb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.396 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[3df2dd33-9b65-499f-967f-7c4137a9d02e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e18fe-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:d7:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 392769, 'reachable_time': 15961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215912, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.416 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd71c3c-e466-4df6-a458-d38537c7d107]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap8e0e18fe-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 392777, 'tstamp': 392777}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215913, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8e0e18fe-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 392779, 'tstamp': 392779}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215913, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.418 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e18fe-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.420 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.421 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.422 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e0e18fe-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.423 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.423 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e0e18fe-00, col_values=(('external_ids', {'iface-id': '38074b11-3b13-4706-9a79-123dd621d667'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.423 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:40:57 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:40:57.425 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4ebe9c-66a4-446a-ad19-bede07bfe0bd]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8e0e18fe-0b80-4494-a6db-546b6daf5fd2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8e0e18fe-0b80-4494-a6db-546b6daf5fd2\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.718 182938 DEBUG nova.compute.manager [req-7ddf39af-9fa1-47e3-b31c-40b0b5da784c req-277ac563-e2ec-4924-81a0-f4fbcdcd5768 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Received event network-vif-plugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.720 182938 DEBUG oslo_concurrency.lockutils [req-7ddf39af-9fa1-47e3-b31c-40b0b5da784c req-277ac563-e2ec-4924-81a0-f4fbcdcd5768 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.720 182938 DEBUG oslo_concurrency.lockutils [req-7ddf39af-9fa1-47e3-b31c-40b0b5da784c req-277ac563-e2ec-4924-81a0-f4fbcdcd5768 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.721 182938 DEBUG oslo_concurrency.lockutils [req-7ddf39af-9fa1-47e3-b31c-40b0b5da784c req-277ac563-e2ec-4924-81a0-f4fbcdcd5768 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:40:57 np0005603500 nova_compute[182934]: 2026-01-31 06:40:57.721 182938 DEBUG nova.compute.manager [req-7ddf39af-9fa1-47e3-b31c-40b0b5da784c req-277ac563-e2ec-4924-81a0-f4fbcdcd5768 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Processing event network-vif-plugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:40:58 np0005603500 nova_compute[182934]: 2026-01-31 06:40:58.480 182938 DEBUG nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:40:58 np0005603500 nova_compute[182934]: 2026-01-31 06:40:58.483 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:40:58 np0005603500 nova_compute[182934]: 2026-01-31 06:40:58.487 182938 INFO nova.virt.libvirt.driver [-] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Instance spawned successfully.
Jan 31 01:40:58 np0005603500 nova_compute[182934]: 2026-01-31 06:40:58.487 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:40:58 np0005603500 nova_compute[182934]: 2026-01-31 06:40:58.698 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:58 np0005603500 nova_compute[182934]: 2026-01-31 06:40:58.872 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:40:59 np0005603500 nova_compute[182934]: 2026-01-31 06:40:59.004 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:40:59 np0005603500 nova_compute[182934]: 2026-01-31 06:40:59.004 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:40:59 np0005603500 nova_compute[182934]: 2026-01-31 06:40:59.005 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:40:59 np0005603500 nova_compute[182934]: 2026-01-31 06:40:59.005 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:40:59 np0005603500 nova_compute[182934]: 2026-01-31 06:40:59.006 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:40:59 np0005603500 nova_compute[182934]: 2026-01-31 06:40:59.006 182938 DEBUG nova.virt.libvirt.driver [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:40:59 np0005603500 nova_compute[182934]: 2026-01-31 06:40:59.573 182938 INFO nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Took 15.70 seconds to spawn the instance on the hypervisor.
Jan 31 01:40:59 np0005603500 nova_compute[182934]: 2026-01-31 06:40:59.574 182938 DEBUG nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:41:00 np0005603500 nova_compute[182934]: 2026-01-31 06:41:00.129 182938 DEBUG nova.compute.manager [req-8d91ed52-78d6-4a9d-aa66-e73714be97e4 req-eb1f16b8-6479-42e4-9826-5c27d02859ff 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Received event network-vif-plugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:41:00 np0005603500 nova_compute[182934]: 2026-01-31 06:41:00.130 182938 DEBUG oslo_concurrency.lockutils [req-8d91ed52-78d6-4a9d-aa66-e73714be97e4 req-eb1f16b8-6479-42e4-9826-5c27d02859ff 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:00 np0005603500 nova_compute[182934]: 2026-01-31 06:41:00.130 182938 DEBUG oslo_concurrency.lockutils [req-8d91ed52-78d6-4a9d-aa66-e73714be97e4 req-eb1f16b8-6479-42e4-9826-5c27d02859ff 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:00 np0005603500 nova_compute[182934]: 2026-01-31 06:41:00.130 182938 DEBUG oslo_concurrency.lockutils [req-8d91ed52-78d6-4a9d-aa66-e73714be97e4 req-eb1f16b8-6479-42e4-9826-5c27d02859ff 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:00 np0005603500 nova_compute[182934]: 2026-01-31 06:41:00.131 182938 DEBUG nova.compute.manager [req-8d91ed52-78d6-4a9d-aa66-e73714be97e4 req-eb1f16b8-6479-42e4-9826-5c27d02859ff 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] No waiting events found dispatching network-vif-plugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:41:00 np0005603500 nova_compute[182934]: 2026-01-31 06:41:00.131 182938 WARNING nova.compute.manager [req-8d91ed52-78d6-4a9d-aa66-e73714be97e4 req-eb1f16b8-6479-42e4-9826-5c27d02859ff 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Received unexpected event network-vif-plugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 for instance with vm_state active and task_state None.
Jan 31 01:41:00 np0005603500 nova_compute[182934]: 2026-01-31 06:41:00.297 182938 INFO nova.compute.manager [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Took 21.80 seconds to build instance.
Jan 31 01:41:01 np0005603500 nova_compute[182934]: 2026-01-31 06:41:01.029 182938 DEBUG oslo_concurrency.lockutils [None req-b70e5794-7ab6-48fc-8b4f-c1d7a773f322 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:03 np0005603500 nova_compute[182934]: 2026-01-31 06:41:03.701 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:03 np0005603500 nova_compute[182934]: 2026-01-31 06:41:03.874 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:04 np0005603500 podman[215922]: 2026-01-31 06:41:04.146808914 +0000 UTC m=+0.059594423 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, version=9.7, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 31 01:41:05 np0005603500 podman[215944]: 2026-01-31 06:41:05.180450159 +0000 UTC m=+0.095344503 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 01:41:08 np0005603500 nova_compute[182934]: 2026-01-31 06:41:08.703 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:08 np0005603500 nova_compute[182934]: 2026-01-31 06:41:08.917 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:12 np0005603500 podman[215974]: 2026-01-31 06:41:12.143478509 +0000 UTC m=+0.054837770 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 01:41:13 np0005603500 nova_compute[182934]: 2026-01-31 06:41:13.705 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:13 np0005603500 nova_compute[182934]: 2026-01-31 06:41:13.919 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:14 np0005603500 podman[216003]: 2026-01-31 06:41:14.164221395 +0000 UTC m=+0.081748460 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:41:17 np0005603500 ovn_controller[95398]: 2026-01-31T06:41:17Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:d4:cd 10.100.0.19
Jan 31 01:41:17 np0005603500 ovn_controller[95398]: 2026-01-31T06:41:17Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:d4:cd 10.100.0.19
Jan 31 01:41:18 np0005603500 nova_compute[182934]: 2026-01-31 06:41:18.706 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:18 np0005603500 nova_compute[182934]: 2026-01-31 06:41:18.921 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:23 np0005603500 nova_compute[182934]: 2026-01-31 06:41:23.710 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:23 np0005603500 nova_compute[182934]: 2026-01-31 06:41:23.922 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:24.993 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:41:24 np0005603500 nova_compute[182934]: 2026-01-31 06:41:24.993 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:24 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:24.995 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:41:25 np0005603500 nova_compute[182934]: 2026-01-31 06:41:25.405 182938 DEBUG nova.compute.manager [req-966f72f0-189d-47e9-9afb-a13895fb8db0 req-ae9af139-fc52-49d4-ac66-ce380ed8af44 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-changed-33eb9351-9906-4872-b960-8ca2037338a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:41:25 np0005603500 nova_compute[182934]: 2026-01-31 06:41:25.405 182938 DEBUG nova.compute.manager [req-966f72f0-189d-47e9-9afb-a13895fb8db0 req-ae9af139-fc52-49d4-ac66-ce380ed8af44 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing instance network info cache due to event network-changed-33eb9351-9906-4872-b960-8ca2037338a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:41:25 np0005603500 nova_compute[182934]: 2026-01-31 06:41:25.405 182938 DEBUG oslo_concurrency.lockutils [req-966f72f0-189d-47e9-9afb-a13895fb8db0 req-ae9af139-fc52-49d4-ac66-ce380ed8af44 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:41:25 np0005603500 nova_compute[182934]: 2026-01-31 06:41:25.406 182938 DEBUG oslo_concurrency.lockutils [req-966f72f0-189d-47e9-9afb-a13895fb8db0 req-ae9af139-fc52-49d4-ac66-ce380ed8af44 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:41:25 np0005603500 nova_compute[182934]: 2026-01-31 06:41:25.406 182938 DEBUG nova.network.neutron [req-966f72f0-189d-47e9-9afb-a13895fb8db0 req-ae9af139-fc52-49d4-ac66-ce380ed8af44 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing network info cache for port 33eb9351-9906-4872-b960-8ca2037338a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:41:26 np0005603500 podman[216033]: 2026-01-31 06:41:26.126449697 +0000 UTC m=+0.045271466 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:41:26 np0005603500 podman[216034]: 2026-01-31 06:41:26.180527182 +0000 UTC m=+0.073420663 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 01:41:28 np0005603500 nova_compute[182934]: 2026-01-31 06:41:28.712 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:28 np0005603500 nova_compute[182934]: 2026-01-31 06:41:28.926 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:30 np0005603500 nova_compute[182934]: 2026-01-31 06:41:30.014 182938 DEBUG nova.network.neutron [req-966f72f0-189d-47e9-9afb-a13895fb8db0 req-ae9af139-fc52-49d4-ac66-ce380ed8af44 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updated VIF entry in instance network info cache for port 33eb9351-9906-4872-b960-8ca2037338a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:41:30 np0005603500 nova_compute[182934]: 2026-01-31 06:41:30.015 182938 DEBUG nova.network.neutron [req-966f72f0-189d-47e9-9afb-a13895fb8db0 req-ae9af139-fc52-49d4-ac66-ce380ed8af44 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:41:30 np0005603500 nova_compute[182934]: 2026-01-31 06:41:30.527 182938 DEBUG oslo_concurrency.lockutils [req-966f72f0-189d-47e9-9afb-a13895fb8db0 req-ae9af139-fc52-49d4-ac66-ce380ed8af44 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.355 182938 DEBUG oslo_concurrency.lockutils [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "011d72bb-c539-4b1c-bd71-14adf765cc17" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.356 182938 DEBUG oslo_concurrency.lockutils [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.356 182938 DEBUG oslo_concurrency.lockutils [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.356 182938 DEBUG oslo_concurrency.lockutils [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.357 182938 DEBUG oslo_concurrency.lockutils [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.358 182938 INFO nova.compute.manager [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Terminating instance
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.925 182938 DEBUG nova.compute.manager [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:41:32 np0005603500 kernel: tapddeb4802-5c (unregistering): left promiscuous mode
Jan 31 01:41:32 np0005603500 NetworkManager[55506]: <info>  [1769841692.9503] device (tapddeb4802-5c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:41:32 np0005603500 ovn_controller[95398]: 2026-01-31T06:41:32Z|00115|binding|INFO|Releasing lport ddeb4802-5c13-4b7c-9ab2-03de55b010c4 from this chassis (sb_readonly=0)
Jan 31 01:41:32 np0005603500 ovn_controller[95398]: 2026-01-31T06:41:32Z|00116|binding|INFO|Setting lport ddeb4802-5c13-4b7c-9ab2-03de55b010c4 down in Southbound
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.954 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:32 np0005603500 ovn_controller[95398]: 2026-01-31T06:41:32Z|00117|binding|INFO|Removing iface tapddeb4802-5c ovn-installed in OVS
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.957 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:32 np0005603500 nova_compute[182934]: 2026-01-31 06:41:32.960 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:32 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:32.970 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:d4:cd 10.100.0.19'], port_security=['fa:16:3e:eb:d4:cd 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '011d72bb-c539-4b1c-bd71-14adf765cc17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7e9d8b57-e99d-441c-8208-450a673f23cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07de4316-ef81-43cb-97c3-675146f1c643, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=ddeb4802-5c13-4b7c-9ab2-03de55b010c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:41:32 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:32.972 104644 INFO neutron.agent.ovn.metadata.agent [-] Port ddeb4802-5c13-4b7c-9ab2-03de55b010c4 in datapath 8e0e18fe-0b80-4494-a6db-546b6daf5fd2 unbound from our chassis
Jan 31 01:41:32 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:32.973 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e0e18fe-0b80-4494-a6db-546b6daf5fd2
Jan 31 01:41:32 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:32.984 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[963f6e2f-b3d8-4743-93a5-9f2908a44c9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:32 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:32.996 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.004 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[95bffda4-ed37-4d17-b29c-bf53e02848f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.007 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1f2473-1cc0-4736-bcd8-ebdc98d205f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:33 np0005603500 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 31 01:41:33 np0005603500 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 13.641s CPU time.
Jan 31 01:41:33 np0005603500 systemd-machined[154375]: Machine qemu-7-instance-00000007 terminated.
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.025 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[9676a6fe-53d1-46b8-b625-ddbbae8b2415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.037 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc0c7c9-6b99-4072-834b-df3810060a1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e0e18fe-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:d7:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 392769, 'reachable_time': 15961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216089, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.050 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[13e0a073-d391-4842-bb6c-0b512e62571b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap8e0e18fe-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 392777, 'tstamp': 392777}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216090, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8e0e18fe-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 392779, 'tstamp': 392779}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216090, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.052 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e18fe-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.081 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.085 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.086 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e0e18fe-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.086 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.086 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e0e18fe-00, col_values=(('external_ids', {'iface-id': '38074b11-3b13-4706-9a79-123dd621d667'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.087 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:41:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:33.088 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f9d5c426-3049-4e64-a73f-6263cb18b51a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8e0e18fe-0b80-4494-a6db-546b6daf5fd2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8e0e18fe-0b80-4494-a6db-546b6daf5fd2\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.175 182938 INFO nova.virt.libvirt.driver [-] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Instance destroyed successfully.
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.176 182938 DEBUG nova.objects.instance [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 011d72bb-c539-4b1c-bd71-14adf765cc17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.311 182938 DEBUG nova.compute.manager [req-c45d6dc5-38c9-4823-93da-b795c5f43267 req-d046a618-bded-4912-a7b1-aa633977c908 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Received event network-vif-unplugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.311 182938 DEBUG oslo_concurrency.lockutils [req-c45d6dc5-38c9-4823-93da-b795c5f43267 req-d046a618-bded-4912-a7b1-aa633977c908 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.311 182938 DEBUG oslo_concurrency.lockutils [req-c45d6dc5-38c9-4823-93da-b795c5f43267 req-d046a618-bded-4912-a7b1-aa633977c908 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.312 182938 DEBUG oslo_concurrency.lockutils [req-c45d6dc5-38c9-4823-93da-b795c5f43267 req-d046a618-bded-4912-a7b1-aa633977c908 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.312 182938 DEBUG nova.compute.manager [req-c45d6dc5-38c9-4823-93da-b795c5f43267 req-d046a618-bded-4912-a7b1-aa633977c908 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] No waiting events found dispatching network-vif-unplugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.312 182938 DEBUG nova.compute.manager [req-c45d6dc5-38c9-4823-93da-b795c5f43267 req-d046a618-bded-4912-a7b1-aa633977c908 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Received event network-vif-unplugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.682 182938 DEBUG nova.virt.libvirt.vif [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:40:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1010426789',display_name='tempest-TestNetworkBasicOps-server-1010426789',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1010426789',id=7,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA9hHTG0FgYJ4RDEfgLrvydcCLPpfNl926aCzkTSOPoigr7OR+EUaQrLzXRkEO6rzE65gx6hfCyIap+rU59sbLZkWobmbXEa5QfZeMW0CQGemHRakSUoqBbt+m0ZJ6Jidw==',key_name='tempest-TestNetworkBasicOps-753337127',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:40:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-k8dt423k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:40:59Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=011d72bb-c539-4b1c-bd71-14adf765cc17,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.682 182938 DEBUG nova.network.os_vif_util [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "address": "fa:16:3e:eb:d4:cd", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddeb4802-5c", "ovs_interfaceid": "ddeb4802-5c13-4b7c-9ab2-03de55b010c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.683 182938 DEBUG nova.network.os_vif_util [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:d4:cd,bridge_name='br-int',has_traffic_filtering=True,id=ddeb4802-5c13-4b7c-9ab2-03de55b010c4,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddeb4802-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.683 182938 DEBUG os_vif [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:d4:cd,bridge_name='br-int',has_traffic_filtering=True,id=ddeb4802-5c13-4b7c-9ab2-03de55b010c4,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddeb4802-5c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.686 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.686 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapddeb4802-5c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.688 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.690 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.691 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.691 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0d977891-63d3-4eaa-a9b9-d25810149f73) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.692 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.693 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.693 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.693 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.696 182938 INFO os_vif [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:d4:cd,bridge_name='br-int',has_traffic_filtering=True,id=ddeb4802-5c13-4b7c-9ab2-03de55b010c4,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddeb4802-5c')
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.697 182938 INFO nova.virt.libvirt.driver [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Deleting instance files /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17_del
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.698 182938 INFO nova.virt.libvirt.driver [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Deletion of /var/lib/nova/instances/011d72bb-c539-4b1c-bd71-14adf765cc17_del complete
Jan 31 01:41:33 np0005603500 nova_compute[182934]: 2026-01-31 06:41:33.714 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:34 np0005603500 nova_compute[182934]: 2026-01-31 06:41:34.209 182938 INFO nova.compute.manager [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Took 1.28 seconds to destroy the instance on the hypervisor.
Jan 31 01:41:34 np0005603500 nova_compute[182934]: 2026-01-31 06:41:34.210 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:41:34 np0005603500 nova_compute[182934]: 2026-01-31 06:41:34.210 182938 DEBUG nova.compute.manager [-] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:41:34 np0005603500 nova_compute[182934]: 2026-01-31 06:41:34.210 182938 DEBUG nova.network.neutron [-] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:41:35 np0005603500 podman[216110]: 2026-01-31 06:41:35.129259008 +0000 UTC m=+0.050220274 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, distribution-scope=public, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 31 01:41:35 np0005603500 nova_compute[182934]: 2026-01-31 06:41:35.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:41:35 np0005603500 nova_compute[182934]: 2026-01-31 06:41:35.527 182938 DEBUG nova.compute.manager [req-bf3390d4-f809-423a-a4d5-cbebcc755310 req-f95c6a2f-67e0-40d1-a22f-7ecaf20b3773 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Received event network-vif-plugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:41:35 np0005603500 nova_compute[182934]: 2026-01-31 06:41:35.527 182938 DEBUG oslo_concurrency.lockutils [req-bf3390d4-f809-423a-a4d5-cbebcc755310 req-f95c6a2f-67e0-40d1-a22f-7ecaf20b3773 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:35 np0005603500 nova_compute[182934]: 2026-01-31 06:41:35.527 182938 DEBUG oslo_concurrency.lockutils [req-bf3390d4-f809-423a-a4d5-cbebcc755310 req-f95c6a2f-67e0-40d1-a22f-7ecaf20b3773 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:35 np0005603500 nova_compute[182934]: 2026-01-31 06:41:35.527 182938 DEBUG oslo_concurrency.lockutils [req-bf3390d4-f809-423a-a4d5-cbebcc755310 req-f95c6a2f-67e0-40d1-a22f-7ecaf20b3773 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:35 np0005603500 nova_compute[182934]: 2026-01-31 06:41:35.528 182938 DEBUG nova.compute.manager [req-bf3390d4-f809-423a-a4d5-cbebcc755310 req-f95c6a2f-67e0-40d1-a22f-7ecaf20b3773 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] No waiting events found dispatching network-vif-plugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:41:35 np0005603500 nova_compute[182934]: 2026-01-31 06:41:35.528 182938 WARNING nova.compute.manager [req-bf3390d4-f809-423a-a4d5-cbebcc755310 req-f95c6a2f-67e0-40d1-a22f-7ecaf20b3773 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Received unexpected event network-vif-plugged-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 for instance with vm_state active and task_state deleting.
Jan 31 01:41:36 np0005603500 nova_compute[182934]: 2026-01-31 06:41:36.013 182938 DEBUG nova.network.neutron [-] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:41:36 np0005603500 nova_compute[182934]: 2026-01-31 06:41:36.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:41:36 np0005603500 podman[216132]: 2026-01-31 06:41:36.157382737 +0000 UTC m=+0.071156652 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 01:41:36 np0005603500 nova_compute[182934]: 2026-01-31 06:41:36.519 182938 INFO nova.compute.manager [-] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Took 2.31 seconds to deallocate network for instance.
Jan 31 01:41:36 np0005603500 nova_compute[182934]: 2026-01-31 06:41:36.653 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:41:36 np0005603500 nova_compute[182934]: 2026-01-31 06:41:36.653 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:41:36 np0005603500 nova_compute[182934]: 2026-01-31 06:41:36.653 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:41:37 np0005603500 nova_compute[182934]: 2026-01-31 06:41:37.029 182938 DEBUG oslo_concurrency.lockutils [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:37 np0005603500 nova_compute[182934]: 2026-01-31 06:41:37.029 182938 DEBUG oslo_concurrency.lockutils [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:37 np0005603500 nova_compute[182934]: 2026-01-31 06:41:37.167 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:37 np0005603500 nova_compute[182934]: 2026-01-31 06:41:37.259 182938 DEBUG nova.scheduler.client.report [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Refreshing inventories for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Jan 31 01:41:37 np0005603500 nova_compute[182934]: 2026-01-31 06:41:37.509 182938 DEBUG nova.scheduler.client.report [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Updating ProviderTree inventory for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Jan 31 01:41:37 np0005603500 nova_compute[182934]: 2026-01-31 06:41:37.510 182938 DEBUG nova.compute.provider_tree [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:41:37 np0005603500 nova_compute[182934]: 2026-01-31 06:41:37.687 182938 DEBUG nova.compute.manager [req-12df5650-5dcb-45a5-94ae-a6696986ac39 req-2ba13dae-afd6-4b5f-b2cb-b67bc359ecb2 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 011d72bb-c539-4b1c-bd71-14adf765cc17] Received event network-vif-deleted-ddeb4802-5c13-4b7c-9ab2-03de55b010c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:41:38 np0005603500 nova_compute[182934]: 2026-01-31 06:41:38.014 182938 DEBUG nova.scheduler.client.report [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Refreshing aggregate associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Jan 31 01:41:38 np0005603500 nova_compute[182934]: 2026-01-31 06:41:38.037 182938 DEBUG nova.scheduler.client.report [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Refreshing trait associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, traits: COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,HW_ARCH_X86_64,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_CRB,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SHA,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_FMA3,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Jan 31 01:41:38 np0005603500 nova_compute[182934]: 2026-01-31 06:41:38.095 182938 DEBUG nova.compute.provider_tree [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:41:38 np0005603500 nova_compute[182934]: 2026-01-31 06:41:38.602 182938 DEBUG nova.scheduler.client.report [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:41:38 np0005603500 nova_compute[182934]: 2026-01-31 06:41:38.692 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:38 np0005603500 nova_compute[182934]: 2026-01-31 06:41:38.715 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:39 np0005603500 nova_compute[182934]: 2026-01-31 06:41:39.112 182938 DEBUG oslo_concurrency.lockutils [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:39 np0005603500 nova_compute[182934]: 2026-01-31 06:41:39.116 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.949s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:39 np0005603500 nova_compute[182934]: 2026-01-31 06:41:39.117 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:39 np0005603500 nova_compute[182934]: 2026-01-31 06:41:39.117 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:41:39 np0005603500 nova_compute[182934]: 2026-01-31 06:41:39.157 182938 INFO nova.scheduler.client.report [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 011d72bb-c539-4b1c-bd71-14adf765cc17
Jan 31 01:41:40 np0005603500 nova_compute[182934]: 2026-01-31 06:41:40.163 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:41:40 np0005603500 nova_compute[182934]: 2026-01-31 06:41:40.184 182938 DEBUG oslo_concurrency.lockutils [None req-7e8c73f8-a322-43ad-9de9-40ac6795887b dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "011d72bb-c539-4b1c-bd71-14adf765cc17" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:40 np0005603500 nova_compute[182934]: 2026-01-31 06:41:40.248 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:41:40 np0005603500 nova_compute[182934]: 2026-01-31 06:41:40.248 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:41:40 np0005603500 nova_compute[182934]: 2026-01-31 06:41:40.299 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:41:40 np0005603500 nova_compute[182934]: 2026-01-31 06:41:40.438 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:41:40 np0005603500 nova_compute[182934]: 2026-01-31 06:41:40.439 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5595MB free_disk=73.1824951171875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:41:40 np0005603500 nova_compute[182934]: 2026-01-31 06:41:40.439 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:40 np0005603500 nova_compute[182934]: 2026-01-31 06:41:40.440 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:41 np0005603500 nova_compute[182934]: 2026-01-31 06:41:41.477 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:41:41 np0005603500 nova_compute[182934]: 2026-01-31 06:41:41.478 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:41:41 np0005603500 nova_compute[182934]: 2026-01-31 06:41:41.478 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:41:41 np0005603500 nova_compute[182934]: 2026-01-31 06:41:41.524 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:41:42 np0005603500 nova_compute[182934]: 2026-01-31 06:41:42.031 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:41:42 np0005603500 nova_compute[182934]: 2026-01-31 06:41:42.395 182938 DEBUG oslo_concurrency.lockutils [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "interface-8515b99e-89c2-4a03-9a14-d6a0c3dca692-33eb9351-9906-4872-b960-8ca2037338a5" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:42 np0005603500 nova_compute[182934]: 2026-01-31 06:41:42.396 182938 DEBUG oslo_concurrency.lockutils [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "interface-8515b99e-89c2-4a03-9a14-d6a0c3dca692-33eb9351-9906-4872-b960-8ca2037338a5" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:42 np0005603500 nova_compute[182934]: 2026-01-31 06:41:42.542 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:41:42 np0005603500 nova_compute[182934]: 2026-01-31 06:41:42.543 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:42 np0005603500 nova_compute[182934]: 2026-01-31 06:41:42.907 182938 DEBUG nova.objects.instance [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'flavor' on Instance uuid 8515b99e-89c2-4a03-9a14-d6a0c3dca692 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.038 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.039 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.039 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:41:43 np0005603500 podman[216166]: 2026-01-31 06:41:43.150978841 +0000 UTC m=+0.057532087 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.417 182938 DEBUG nova.virt.libvirt.vif [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:39:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-3559488',display_name='tempest-TestNetworkBasicOps-server-3559488',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-3559488',id=6,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChYKDUjvhjCGs5+r/emnCrvtzxjRm0xGy2EtjvWZf5uSCdLdeFqDloA/lKvnHUNOd7mjbOXH0cnbc13Uy4w1Pw4Qnj3hkpXumiALlpO/vfZJol+WTWZCmQRTTzYN7UeJQ==',key_name='tempest-TestNetworkBasicOps-1595486681',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:39:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-gu629u0n',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:39:29Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=8515b99e-89c2-4a03-9a14-d6a0c3dca692,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.418 182938 DEBUG nova.network.os_vif_util [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.419 182938 DEBUG nova.network.os_vif_util [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.423 182938 DEBUG nova.virt.libvirt.guest [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.425 182938 DEBUG nova.virt.libvirt.guest [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.427 182938 DEBUG nova.virt.libvirt.driver [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Attempting to detach device tap33eb9351-99 from instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2637
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.428 182938 DEBUG nova.virt.libvirt.guest [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] detach device xml: <interface type="ethernet">
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <mac address="fa:16:3e:26:e4:61"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <model type="virtio"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <mtu size="1442"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <target dev="tap33eb9351-99"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]: </interface>
Jan 31 01:41:43 np0005603500 nova_compute[182934]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:466
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.570 182938 DEBUG nova.virt.libvirt.guest [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.575 182938 DEBUG nova.virt.libvirt.guest [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface>not found in domain: <domain type='kvm' id='6'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <name>instance-00000006</name>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <uuid>8515b99e-89c2-4a03-9a14-d6a0c3dca692</uuid>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-3559488</nova:name>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:40:18</nova:creationTime>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:port uuid="df57296c-8ac3-44b0-8fb6-9f85d0a93bdc">
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <nova:port uuid="33eb9351-9906-4872-b960-8ca2037338a5">
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.23" ipVersion="4"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:41:43 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <memory unit='KiB'>131072</memory>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <vcpu placement='static'>1</vcpu>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <resource>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <partition>/machine</partition>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </resource>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <sysinfo type='smbios'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <entry name='manufacturer'>RDO</entry>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <entry name='serial'>8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <entry name='uuid'>8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <entry name='family'>Virtual Machine</entry>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <boot dev='hd'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <smbios mode='sysinfo'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <vmcoreinfo state='on'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <vendor>AMD</vendor>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='x2apic'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc-deadline'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='hypervisor'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc_adjust'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='spec-ctrl'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='stibp'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='ssbd'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='cmp_legacy'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='overflow-recov'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='succor'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='ibrs'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='amd-ssbd'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='virt-ssbd'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='lbrv'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='tsc-scale'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='vmcb-clean'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='flushbyasid'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pause-filter'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pfthreshold'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='xsaves'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svm'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='require' name='topoext'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='npt'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <feature policy='disable' name='nrip-save'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <clock offset='utc'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <timer name='hpet' present='no'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <on_poweroff>destroy</on_poweroff>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <on_reboot>restart</on_reboot>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <on_crash>destroy</on_crash>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <disk type='file' device='disk'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk' index='2'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <backingStore type='file' index='3'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:        <format type='raw'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:        <source file='/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:        <backingStore/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      </backingStore>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target dev='vda' bus='virtio'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='virtio-disk0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <disk type='file' device='cdrom'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.config' index='1'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <backingStore/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target dev='sda' bus='sata'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <readonly/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='sata0-0-0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pcie.0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='1' port='0x10'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.1'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='2' port='0x11'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.2'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='3' port='0x12'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.3'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='4' port='0x13'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.4'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='5' port='0x14'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.5'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='6' port='0x15'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.6'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='7' port='0x16'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.7'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='8' port='0x17'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.8'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='9' port='0x18'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.9'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='10' port='0x19'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.10'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='11' port='0x1a'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.11'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='12' port='0x1b'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.12'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='13' port='0x1c'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.13'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='14' port='0x1d'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.14'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='15' port='0x1e'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.15'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='16' port='0x1f'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.16'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='17' port='0x20'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.17'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='18' port='0x21'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.18'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='19' port='0x22'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.19'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='20' port='0x23'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.20'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='21' port='0x24'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.21'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='22' port='0x25'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.22'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='23' port='0x26'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.23'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='24' port='0x27'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.24'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target chassis='25' port='0x28'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.25'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model name='pcie-pci-bridge'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='pci.26'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='usb'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <controller type='sata' index='0'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='ide'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:bf:6a:c9'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target dev='tapdf57296c-8a'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='net0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:26:e4:61'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target dev='tap33eb9351-99'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='net1'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <serial type='pty'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log' append='off'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target type='isa-serial' port='0'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:        <model name='isa-serial'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      </target>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log' append='off'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <target type='serial' port='0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <input type='tablet' bus='usb'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='input0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='usb' bus='0' port='1'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <input type='mouse' bus='ps2'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='input1'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <input type='keyboard' bus='ps2'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='input2'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <listen type='address' address='::0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <audio id='1' type='none'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='video0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <watchdog model='itco' action='reset'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='watchdog0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </watchdog>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <memballoon model='virtio'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <stats period='10'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='balloon0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <rng model='virtio'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <backend model='random'>/dev/urandom</backend>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <alias name='rng0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <label>system_u:system_r:svirt_t:s0:c103,c879</label>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c103,c879</imagelabel>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <label>+107:+107</label>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:    <imagelabel>+107:+107</imagelabel>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:41:43 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:41:43 np0005603500 nova_compute[182934]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.577 182938 INFO nova.virt.libvirt.driver [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully detached device tap33eb9351-99 from instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 from the persistent domain config.
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.577 182938 DEBUG nova.virt.libvirt.driver [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] (1/8): Attempting to detach device tap33eb9351-99 with device alias net1 from instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2673
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.578 182938 DEBUG nova.virt.libvirt.guest [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] detach device xml: <interface type="ethernet">
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <mac address="fa:16:3e:26:e4:61"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <model type="virtio"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <mtu size="1442"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]:  <target dev="tap33eb9351-99"/>
Jan 31 01:41:43 np0005603500 nova_compute[182934]: </interface>
Jan 31 01:41:43 np0005603500 nova_compute[182934]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:466
Jan 31 01:41:43 np0005603500 kernel: tap33eb9351-99 (unregistering): left promiscuous mode
Jan 31 01:41:43 np0005603500 NetworkManager[55506]: <info>  [1769841703.6687] device (tap33eb9351-99): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:41:43 np0005603500 ovn_controller[95398]: 2026-01-31T06:41:43Z|00118|binding|INFO|Releasing lport 33eb9351-9906-4872-b960-8ca2037338a5 from this chassis (sb_readonly=0)
Jan 31 01:41:43 np0005603500 ovn_controller[95398]: 2026-01-31T06:41:43Z|00119|binding|INFO|Setting lport 33eb9351-9906-4872-b960-8ca2037338a5 down in Southbound
Jan 31 01:41:43 np0005603500 ovn_controller[95398]: 2026-01-31T06:41:43Z|00120|binding|INFO|Removing iface tap33eb9351-99 ovn-installed in OVS
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.673 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.675 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.679 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.682 182938 DEBUG nova.virt.libvirt.driver [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Start waiting for the detach event from libvirt for device tap33eb9351-99 with device alias net1 for instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2749
Jan 31 01:41:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:43.682 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:e4:61 10.100.0.23', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '8515b99e-89c2-4a03-9a14-d6a0c3dca692', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07de4316-ef81-43cb-97c3-675146f1c643, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=33eb9351-9906-4872-b960-8ca2037338a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:41:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:43.684 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 33eb9351-9906-4872-b960-8ca2037338a5 in datapath 8e0e18fe-0b80-4494-a6db-546b6daf5fd2 unbound from our chassis
Jan 31 01:41:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:43.687 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e0e18fe-0b80-4494-a6db-546b6daf5fd2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:41:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:43.688 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a89f1f63-822c-4210-bf8e-88390119f065]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:43.689 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2 namespace which is not needed anymore
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.695 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:43 np0005603500 nova_compute[182934]: 2026-01-31 06:41:43.716 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:43 np0005603500 neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2[215672]: [NOTICE]   (215676) : haproxy version is 2.8.14-c23fe91
Jan 31 01:41:43 np0005603500 neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2[215672]: [NOTICE]   (215676) : path to executable is /usr/sbin/haproxy
Jan 31 01:41:43 np0005603500 neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2[215672]: [WARNING]  (215676) : Exiting Master process...
Jan 31 01:41:43 np0005603500 podman[216210]: 2026-01-31 06:41:43.802541313 +0000 UTC m=+0.028415398 container kill c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true)
Jan 31 01:41:43 np0005603500 neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2[215672]: [ALERT]    (215676) : Current worker (215678) exited with code 143 (Terminated)
Jan 31 01:41:43 np0005603500 neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2[215672]: [WARNING]  (215676) : All workers exited. Exiting... (0)
Jan 31 01:41:43 np0005603500 systemd[1]: libpod-c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a.scope: Deactivated successfully.
Jan 31 01:41:44 np0005603500 nova_compute[182934]: 2026-01-31 06:41:44.031 182938 DEBUG nova.compute.manager [req-15349975-71e5-4e7b-b6b0-1b96fd8575a8 req-ba48b0e4-9a20-4928-89be-9741afe21fc9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-unplugged-33eb9351-9906-4872-b960-8ca2037338a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:41:44 np0005603500 nova_compute[182934]: 2026-01-31 06:41:44.032 182938 DEBUG oslo_concurrency.lockutils [req-15349975-71e5-4e7b-b6b0-1b96fd8575a8 req-ba48b0e4-9a20-4928-89be-9741afe21fc9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:44 np0005603500 nova_compute[182934]: 2026-01-31 06:41:44.032 182938 DEBUG oslo_concurrency.lockutils [req-15349975-71e5-4e7b-b6b0-1b96fd8575a8 req-ba48b0e4-9a20-4928-89be-9741afe21fc9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:44 np0005603500 nova_compute[182934]: 2026-01-31 06:41:44.032 182938 DEBUG oslo_concurrency.lockutils [req-15349975-71e5-4e7b-b6b0-1b96fd8575a8 req-ba48b0e4-9a20-4928-89be-9741afe21fc9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:44 np0005603500 nova_compute[182934]: 2026-01-31 06:41:44.032 182938 DEBUG nova.compute.manager [req-15349975-71e5-4e7b-b6b0-1b96fd8575a8 req-ba48b0e4-9a20-4928-89be-9741afe21fc9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] No waiting events found dispatching network-vif-unplugged-33eb9351-9906-4872-b960-8ca2037338a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:41:44 np0005603500 nova_compute[182934]: 2026-01-31 06:41:44.032 182938 WARNING nova.compute.manager [req-15349975-71e5-4e7b-b6b0-1b96fd8575a8 req-ba48b0e4-9a20-4928-89be-9741afe21fc9 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received unexpected event network-vif-unplugged-33eb9351-9906-4872-b960-8ca2037338a5 for instance with vm_state active and task_state None.
Jan 31 01:41:44 np0005603500 podman[216238]: 2026-01-31 06:41:44.118453794 +0000 UTC m=+0.023386917 container died c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 01:41:44 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a-userdata-shm.mount: Deactivated successfully.
Jan 31 01:41:44 np0005603500 systemd[1]: var-lib-containers-storage-overlay-4fd377f790dd66126ef63ff528e5db2accba3de3e5aeafaafeffc9138a0dd798-merged.mount: Deactivated successfully.
Jan 31 01:41:44 np0005603500 podman[216254]: 2026-01-31 06:41:44.75505748 +0000 UTC m=+0.186695200 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:41:45 np0005603500 podman[216238]: 2026-01-31 06:41:45.71429829 +0000 UTC m=+1.619231423 container remove c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:41:45 np0005603500 systemd[1]: libpod-conmon-c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a.scope: Deactivated successfully.
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.719 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[7775aba3-fb2b-481c-83be-fed256a40b91]: (4, ("Sat Jan 31 06:41:43 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2 (c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a)\nc17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a\nSat Jan 31 06:41:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2 (c17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a)\nc17f83c4fbcc1bff6cfeb306b8050dee20f534b1bb7dd349b2da63269e88254a\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.721 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee92099-de48-4e23-a4c0-d1f55f56948e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.721 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e0e18fe-0b80-4494-a6db-546b6daf5fd2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.722 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c0a1c88b-591b-46c3-82e9-564b776a5e0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.722 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e0e18fe-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:41:45 np0005603500 nova_compute[182934]: 2026-01-31 06:41:45.759 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:45 np0005603500 kernel: tap8e0e18fe-00: left promiscuous mode
Jan 31 01:41:45 np0005603500 nova_compute[182934]: 2026-01-31 06:41:45.768 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.774 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[66da73ce-93f6-4b24-9c59-11976662223f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.787 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc4a404-e422-4da6-919f-6824c6cd767c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.789 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ab01090d-3ab4-47e2-88eb-760eaeb06570]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.801 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[cece9c6e-e150-49de-bc26-9cb723c61040]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 392765, 'reachable_time': 31204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216286, 'error': None, 'target': 'ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.803 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e0e18fe-0b80-4494-a6db-546b6daf5fd2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:41:45 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:45.803 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1760bf-351c-443a-9b6e-8e0745aee570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:41:45 np0005603500 systemd[1]: run-netns-ovnmeta\x2d8e0e18fe\x2d0b80\x2d4494\x2da6db\x2d546b6daf5fd2.mount: Deactivated successfully.
Jan 31 01:41:46 np0005603500 nova_compute[182934]: 2026-01-31 06:41:46.257 182938 DEBUG nova.compute.manager [req-b7d2296b-49d4-45c3-ae8a-5fdd2f69d06a req-979614e6-cd1e-4223-82a3-c278ba1e6c7a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-plugged-33eb9351-9906-4872-b960-8ca2037338a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:41:46 np0005603500 nova_compute[182934]: 2026-01-31 06:41:46.257 182938 DEBUG oslo_concurrency.lockutils [req-b7d2296b-49d4-45c3-ae8a-5fdd2f69d06a req-979614e6-cd1e-4223-82a3-c278ba1e6c7a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:46 np0005603500 nova_compute[182934]: 2026-01-31 06:41:46.257 182938 DEBUG oslo_concurrency.lockutils [req-b7d2296b-49d4-45c3-ae8a-5fdd2f69d06a req-979614e6-cd1e-4223-82a3-c278ba1e6c7a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:46 np0005603500 nova_compute[182934]: 2026-01-31 06:41:46.257 182938 DEBUG oslo_concurrency.lockutils [req-b7d2296b-49d4-45c3-ae8a-5fdd2f69d06a req-979614e6-cd1e-4223-82a3-c278ba1e6c7a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:46 np0005603500 nova_compute[182934]: 2026-01-31 06:41:46.257 182938 DEBUG nova.compute.manager [req-b7d2296b-49d4-45c3-ae8a-5fdd2f69d06a req-979614e6-cd1e-4223-82a3-c278ba1e6c7a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] No waiting events found dispatching network-vif-plugged-33eb9351-9906-4872-b960-8ca2037338a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:41:46 np0005603500 nova_compute[182934]: 2026-01-31 06:41:46.258 182938 WARNING nova.compute.manager [req-b7d2296b-49d4-45c3-ae8a-5fdd2f69d06a req-979614e6-cd1e-4223-82a3-c278ba1e6c7a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received unexpected event network-vif-plugged-33eb9351-9906-4872-b960-8ca2037338a5 for instance with vm_state active and task_state None.
Jan 31 01:41:48 np0005603500 nova_compute[182934]: 2026-01-31 06:41:48.697 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:48 np0005603500 nova_compute[182934]: 2026-01-31 06:41:48.717 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:53 np0005603500 nova_compute[182934]: 2026-01-31 06:41:53.700 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:53 np0005603500 nova_compute[182934]: 2026-01-31 06:41:53.719 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:56.030 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:41:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:56.031 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:41:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:41:56.031 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:41:57 np0005603500 podman[216288]: 2026-01-31 06:41:57.158911174 +0000 UTC m=+0.074983014 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:41:57 np0005603500 podman[216289]: 2026-01-31 06:41:57.16597651 +0000 UTC m=+0.077806595 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:41:58 np0005603500 nova_compute[182934]: 2026-01-31 06:41:58.703 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:41:58 np0005603500 nova_compute[182934]: 2026-01-31 06:41:58.720 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.683 182938 WARNING nova.virt.libvirt.driver [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for libvirt event about the detach of device tap33eb9351-99 with device alias net1 from instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 is timed out.
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.683 182938 DEBUG nova.virt.libvirt.guest [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.686 182938 DEBUG nova.virt.libvirt.guest [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface>not found in domain: <domain type='kvm' id='6'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <name>instance-00000006</name>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <uuid>8515b99e-89c2-4a03-9a14-d6a0c3dca692</uuid>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-3559488</nova:name>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:40:18</nova:creationTime>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:port uuid="df57296c-8ac3-44b0-8fb6-9f85d0a93bdc">
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:port uuid="33eb9351-9906-4872-b960-8ca2037338a5">
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.23" ipVersion="4"/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:42:03 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <memory unit='KiB'>131072</memory>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <vcpu placement='static'>1</vcpu>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <resource>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <partition>/machine</partition>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </resource>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <sysinfo type='smbios'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <entry name='manufacturer'>RDO</entry>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <entry name='serial'>8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <entry name='uuid'>8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <entry name='family'>Virtual Machine</entry>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <boot dev='hd'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <smbios mode='sysinfo'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <vmcoreinfo state='on'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <vendor>AMD</vendor>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='x2apic'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc-deadline'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='hypervisor'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc_adjust'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='spec-ctrl'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='stibp'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='ssbd'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='cmp_legacy'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='overflow-recov'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='succor'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='ibrs'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='amd-ssbd'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='virt-ssbd'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='lbrv'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='tsc-scale'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='vmcb-clean'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='flushbyasid'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pause-filter'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pfthreshold'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='xsaves'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svm'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='require' name='topoext'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='npt'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <feature policy='disable' name='nrip-save'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <clock offset='utc'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <timer name='hpet' present='no'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <on_poweroff>destroy</on_poweroff>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <on_reboot>restart</on_reboot>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <on_crash>destroy</on_crash>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <disk type='file' device='disk'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk' index='2'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <backingStore type='file' index='3'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:        <format type='raw'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:        <source file='/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:        <backingStore/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      </backingStore>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target dev='vda' bus='virtio'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='virtio-disk0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <disk type='file' device='cdrom'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.config' index='1'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <backingStore/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target dev='sda' bus='sata'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <readonly/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='sata0-0-0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pcie.0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='1' port='0x10'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.1'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='2' port='0x11'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.2'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='3' port='0x12'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.3'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='4' port='0x13'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.4'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='5' port='0x14'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.5'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='6' port='0x15'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.6'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='7' port='0x16'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.7'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='8' port='0x17'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.8'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='9' port='0x18'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.9'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='10' port='0x19'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.10'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='11' port='0x1a'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.11'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='12' port='0x1b'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.12'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='13' port='0x1c'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.13'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='14' port='0x1d'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.14'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='15' port='0x1e'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.15'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='16' port='0x1f'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.16'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='17' port='0x20'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.17'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='18' port='0x21'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.18'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='19' port='0x22'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.19'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='20' port='0x23'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.20'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='21' port='0x24'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.21'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='22' port='0x25'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.22'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='23' port='0x26'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.23'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='24' port='0x27'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.24'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target chassis='25' port='0x28'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.25'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model name='pcie-pci-bridge'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='pci.26'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='usb'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <controller type='sata' index='0'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='ide'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:bf:6a:c9'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target dev='tapdf57296c-8a'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='net0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <serial type='pty'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log' append='off'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target type='isa-serial' port='0'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:        <model name='isa-serial'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      </target>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log' append='off'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <target type='serial' port='0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <input type='tablet' bus='usb'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='input0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='usb' bus='0' port='1'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <input type='mouse' bus='ps2'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='input1'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <input type='keyboard' bus='ps2'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='input2'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <listen type='address' address='::0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <audio id='1' type='none'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='video0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <watchdog model='itco' action='reset'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='watchdog0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </watchdog>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <memballoon model='virtio'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <stats period='10'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='balloon0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <rng model='virtio'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <backend model='random'>/dev/urandom</backend>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <alias name='rng0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <label>system_u:system_r:svirt_t:s0:c103,c879</label>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c103,c879</imagelabel>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <label>+107:+107</label>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <imagelabel>+107:+107</imagelabel>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:42:03 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:42:03 np0005603500 nova_compute[182934]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.687 182938 INFO nova.virt.libvirt.driver [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully detached device tap33eb9351-99 from instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 from the live domain config.
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.688 182938 DEBUG nova.virt.libvirt.vif [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:39:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-3559488',display_name='tempest-TestNetworkBasicOps-server-3559488',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-3559488',id=6,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChYKDUjvhjCGs5+r/emnCrvtzxjRm0xGy2EtjvWZf5uSCdLdeFqDloA/lKvnHUNOd7mjbOXH0cnbc13Uy4w1Pw4Qnj3hkpXumiALlpO/vfZJol+WTWZCmQRTTzYN7UeJQ==',key_name='tempest-TestNetworkBasicOps-1595486681',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:39:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-gu629u0n',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:39:29Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=8515b99e-89c2-4a03-9a14-d6a0c3dca692,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.689 182938 DEBUG nova.network.os_vif_util [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.689 182938 DEBUG nova.network.os_vif_util [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.690 182938 DEBUG os_vif [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.691 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.692 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33eb9351-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.693 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.697 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.698 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.698 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c0228587-d3cb-4ae5-b27b-9ddc8f16b4a3) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.700 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.703 182938 INFO os_vif [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99')
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.704 182938 DEBUG nova.virt.driver [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-3559488', uuid='8515b99e-89c2-4a03-9a14-d6a0c3dca692'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bit
torrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841723.7044406) get_instance_driver_metadata 
/usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.705 182938 DEBUG nova.virt.libvirt.guest [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-3559488</nova:name>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:42:03</nova:creationTime>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    <nova:port uuid="df57296c-8ac3-44b0-8fb6-9f85d0a93bdc">
Jan 31 01:42:03 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:42:03 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:42:03 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:42:03 np0005603500 nova_compute[182934]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Jan 31 01:42:03 np0005603500 nova_compute[182934]: 2026-01-31 06:42:03.723 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:05 np0005603500 nova_compute[182934]: 2026-01-31 06:42:05.037 182938 DEBUG nova.compute.manager [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-deleted-33eb9351-9906-4872-b960-8ca2037338a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:42:05 np0005603500 nova_compute[182934]: 2026-01-31 06:42:05.037 182938 INFO nova.compute.manager [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Neutron deleted interface 33eb9351-9906-4872-b960-8ca2037338a5; detaching it from the instance and deleting it from the info cache
Jan 31 01:42:05 np0005603500 nova_compute[182934]: 2026-01-31 06:42:05.037 182938 DEBUG nova.network.neutron [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:42:05 np0005603500 nova_compute[182934]: 2026-01-31 06:42:05.547 182938 DEBUG nova.objects.instance [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lazy-loading 'system_metadata' on Instance uuid 8515b99e-89c2-4a03-9a14-d6a0c3dca692 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:42:05 np0005603500 nova_compute[182934]: 2026-01-31 06:42:05.835 182938 DEBUG oslo_concurrency.lockutils [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:42:05 np0005603500 nova_compute[182934]: 2026-01-31 06:42:05.836 182938 DEBUG oslo_concurrency.lockutils [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:42:05 np0005603500 nova_compute[182934]: 2026-01-31 06:42:05.836 182938 DEBUG nova.network.neutron [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.054 182938 DEBUG nova.objects.instance [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lazy-loading 'flavor' on Instance uuid 8515b99e-89c2-4a03-9a14-d6a0c3dca692 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:42:06 np0005603500 podman[216331]: 2026-01-31 06:42:06.131538505 +0000 UTC m=+0.047494956 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 01:42:06 np0005603500 ovn_controller[95398]: 2026-01-31T06:42:06Z|00121|binding|INFO|Releasing lport 2f0d159b-8dc8-4f02-92d2-05f0b9cc7315 from this chassis (sb_readonly=0)
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.291 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.561 182938 DEBUG nova.objects.base [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Object Instance<8515b99e-89c2-4a03-9a14-d6a0c3dca692> lazy-loaded attributes: system_metadata,flavor wrapper /usr/lib/python3.9/site-packages/nova/objects/base.py:136
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.562 182938 DEBUG nova.virt.libvirt.vif [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:39:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-3559488',display_name='tempest-TestNetworkBasicOps-server-3559488',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-3559488',id=6,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChYKDUjvhjCGs5+r/emnCrvtzxjRm0xGy2EtjvWZf5uSCdLdeFqDloA/lKvnHUNOd7mjbOXH0cnbc13Uy4w1Pw4Qnj3hkpXumiALlpO/vfZJol+WTWZCmQRTTzYN7UeJQ==',key_name='tempest-TestNetworkBasicOps-1595486681',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:39:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-gu629u0n',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:39:29Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=8515b99e-89c2-4a03-9a14-d6a0c3dca692,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.562 182938 DEBUG nova.network.os_vif_util [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Converting VIF {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.563 182938 DEBUG nova.network.os_vif_util [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.565 182938 DEBUG nova.virt.libvirt.guest [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.569 182938 DEBUG nova.virt.libvirt.guest [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface>not found in domain: <domain type='kvm' id='6'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <name>instance-00000006</name>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <uuid>8515b99e-89c2-4a03-9a14-d6a0c3dca692</uuid>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-3559488</nova:name>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:42:03</nova:creationTime>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:port uuid="df57296c-8ac3-44b0-8fb6-9f85d0a93bdc">
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:42:06 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <memory unit='KiB'>131072</memory>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <vcpu placement='static'>1</vcpu>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <resource>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <partition>/machine</partition>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </resource>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <sysinfo type='smbios'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='manufacturer'>RDO</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='serial'>8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='uuid'>8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='family'>Virtual Machine</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <boot dev='hd'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <smbios mode='sysinfo'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <vmcoreinfo state='on'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <vendor>AMD</vendor>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='x2apic'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc-deadline'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='hypervisor'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc_adjust'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='spec-ctrl'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='stibp'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='ssbd'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='cmp_legacy'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='overflow-recov'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='succor'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='ibrs'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='amd-ssbd'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='virt-ssbd'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='lbrv'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='tsc-scale'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='vmcb-clean'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='flushbyasid'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pause-filter'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pfthreshold'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='xsaves'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svm'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='topoext'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='npt'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='nrip-save'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <clock offset='utc'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <timer name='hpet' present='no'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <on_poweroff>destroy</on_poweroff>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <on_reboot>restart</on_reboot>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <on_crash>destroy</on_crash>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <disk type='file' device='disk'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk' index='2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <backingStore type='file' index='3'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:        <format type='raw'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:        <source file='/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:        <backingStore/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      </backingStore>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target dev='vda' bus='virtio'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='virtio-disk0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <disk type='file' device='cdrom'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.config' index='1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <backingStore/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target dev='sda' bus='sata'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <readonly/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='sata0-0-0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pcie.0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='1' port='0x10'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='2' port='0x11'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='3' port='0x12'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.3'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='4' port='0x13'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.4'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='5' port='0x14'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.5'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='6' port='0x15'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.6'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='7' port='0x16'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.7'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='8' port='0x17'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.8'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='9' port='0x18'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.9'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='10' port='0x19'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.10'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='11' port='0x1a'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.11'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='12' port='0x1b'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.12'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='13' port='0x1c'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.13'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='14' port='0x1d'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.14'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='15' port='0x1e'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.15'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='16' port='0x1f'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.16'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='17' port='0x20'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.17'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='18' port='0x21'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.18'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='19' port='0x22'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.19'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='20' port='0x23'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.20'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='21' port='0x24'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.21'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='22' port='0x25'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.22'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='23' port='0x26'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.23'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='24' port='0x27'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.24'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='25' port='0x28'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.25'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-pci-bridge'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.26'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='usb'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='sata' index='0'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='ide'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:bf:6a:c9'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target dev='tapdf57296c-8a'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='net0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <serial type='pty'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log' append='off'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target type='isa-serial' port='0'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:        <model name='isa-serial'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      </target>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log' append='off'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target type='serial' port='0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <input type='tablet' bus='usb'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='input0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='usb' bus='0' port='1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <input type='mouse' bus='ps2'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='input1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <input type='keyboard' bus='ps2'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='input2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <listen type='address' address='::0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <audio id='1' type='none'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='video0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <watchdog model='itco' action='reset'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='watchdog0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </watchdog>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <memballoon model='virtio'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <stats period='10'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='balloon0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <rng model='virtio'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <backend model='random'>/dev/urandom</backend>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='rng0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <label>system_u:system_r:svirt_t:s0:c103,c879</label>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c103,c879</imagelabel>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <label>+107:+107</label>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <imagelabel>+107:+107</imagelabel>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:42:06 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:42:06 np0005603500 nova_compute[182934]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.569 182938 DEBUG nova.virt.libvirt.guest [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.573 182938 DEBUG nova.virt.libvirt.guest [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:26:e4:61"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap33eb9351-99"/></interface>not found in domain: <domain type='kvm' id='6'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <name>instance-00000006</name>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <uuid>8515b99e-89c2-4a03-9a14-d6a0c3dca692</uuid>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-3559488</nova:name>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:42:03</nova:creationTime>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:port uuid="df57296c-8ac3-44b0-8fb6-9f85d0a93bdc">
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:42:06 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <memory unit='KiB'>131072</memory>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <vcpu placement='static'>1</vcpu>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <resource>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <partition>/machine</partition>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </resource>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <sysinfo type='smbios'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='manufacturer'>RDO</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='serial'>8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='uuid'>8515b99e-89c2-4a03-9a14-d6a0c3dca692</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <entry name='family'>Virtual Machine</entry>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <boot dev='hd'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <smbios mode='sysinfo'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <vmcoreinfo state='on'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <vendor>AMD</vendor>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='x2apic'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc-deadline'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='hypervisor'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='tsc_adjust'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='spec-ctrl'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='stibp'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='ssbd'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='cmp_legacy'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='overflow-recov'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='succor'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='ibrs'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='amd-ssbd'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='virt-ssbd'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='lbrv'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='tsc-scale'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='vmcb-clean'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='flushbyasid'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pause-filter'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='pfthreshold'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='xsaves'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='svm'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='require' name='topoext'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='npt'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <feature policy='disable' name='nrip-save'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <clock offset='utc'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <timer name='hpet' present='no'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <on_poweroff>destroy</on_poweroff>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <on_reboot>restart</on_reboot>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <on_crash>destroy</on_crash>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <disk type='file' device='disk'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk' index='2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <backingStore type='file' index='3'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:        <format type='raw'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:        <source file='/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:        <backingStore/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      </backingStore>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target dev='vda' bus='virtio'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='virtio-disk0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <disk type='file' device='cdrom'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <source file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/disk.config' index='1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <backingStore/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target dev='sda' bus='sata'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <readonly/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='sata0-0-0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pcie.0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='1' port='0x10'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='2' port='0x11'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='3' port='0x12'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.3'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='4' port='0x13'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.4'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='5' port='0x14'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.5'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='6' port='0x15'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.6'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='7' port='0x16'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.7'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='8' port='0x17'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.8'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='9' port='0x18'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.9'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='10' port='0x19'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.10'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='11' port='0x1a'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.11'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='12' port='0x1b'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.12'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='13' port='0x1c'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.13'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='14' port='0x1d'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.14'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='15' port='0x1e'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.15'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='16' port='0x1f'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.16'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='17' port='0x20'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.17'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='18' port='0x21'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.18'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='19' port='0x22'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.19'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='20' port='0x23'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.20'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='21' port='0x24'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.21'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='22' port='0x25'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.22'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='23' port='0x26'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.23'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='24' port='0x27'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.24'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-root-port'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target chassis='25' port='0x28'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.25'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model name='pcie-pci-bridge'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='pci.26'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='usb'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <controller type='sata' index='0'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='ide'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </controller>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <interface type='ethernet'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <mac address='fa:16:3e:bf:6a:c9'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target dev='tapdf57296c-8a'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model type='virtio'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <mtu size='1442'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='net0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <serial type='pty'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log' append='off'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target type='isa-serial' port='0'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:        <model name='isa-serial'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      </target>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <source path='/dev/pts/0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <log file='/var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692/console.log' append='off'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <target type='serial' port='0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='serial0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </console>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <input type='tablet' bus='usb'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='input0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='usb' bus='0' port='1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <input type='mouse' bus='ps2'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='input1'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <input type='keyboard' bus='ps2'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='input2'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </input>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <listen type='address' address='::0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </graphics>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <audio id='1' type='none'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='video0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <watchdog model='itco' action='reset'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='watchdog0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </watchdog>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <memballoon model='virtio'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <stats period='10'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='balloon0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <rng model='virtio'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <backend model='random'>/dev/urandom</backend>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <alias name='rng0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <label>system_u:system_r:svirt_t:s0:c103,c879</label>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c103,c879</imagelabel>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <label>+107:+107</label>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <imagelabel>+107:+107</imagelabel>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </seclabel>
Jan 31 01:42:06 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:42:06 np0005603500 nova_compute[182934]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.573 182938 WARNING nova.virt.libvirt.driver [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Detaching interface fa:16:3e:26:e4:61 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap33eb9351-99' not found.
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.574 182938 DEBUG nova.virt.libvirt.vif [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:39:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-3559488',display_name='tempest-TestNetworkBasicOps-server-3559488',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-3559488',id=6,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChYKDUjvhjCGs5+r/emnCrvtzxjRm0xGy2EtjvWZf5uSCdLdeFqDloA/lKvnHUNOd7mjbOXH0cnbc13Uy4w1Pw4Qnj3hkpXumiALlpO/vfZJol+WTWZCmQRTTzYN7UeJQ==',key_name='tempest-TestNetworkBasicOps-1595486681',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:39:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-gu629u0n',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:39:29Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=8515b99e-89c2-4a03-9a14-d6a0c3dca692,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.574 182938 DEBUG nova.network.os_vif_util [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Converting VIF {"id": "33eb9351-9906-4872-b960-8ca2037338a5", "address": "fa:16:3e:26:e4:61", "network": {"id": "8e0e18fe-0b80-4494-a6db-546b6daf5fd2", "bridge": "br-int", "label": "tempest-network-smoke--497575700", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33eb9351-99", "ovs_interfaceid": "33eb9351-9906-4872-b960-8ca2037338a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.574 182938 DEBUG nova.network.os_vif_util [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.575 182938 DEBUG os_vif [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.576 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.576 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33eb9351-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.576 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.578 182938 INFO os_vif [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:e4:61,bridge_name='br-int',has_traffic_filtering=True,id=33eb9351-9906-4872-b960-8ca2037338a5,network=Network(8e0e18fe-0b80-4494-a6db-546b6daf5fd2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33eb9351-99')
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.579 182938 DEBUG nova.virt.driver [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-3559488', uuid='8515b99e-89c2-4a03-9a14-d6a0c3dca692'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841726.5792966) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:42:06 np0005603500 nova_compute[182934]: 2026-01-31 06:42:06.580 182938 DEBUG nova.virt.libvirt.guest [req-8ccb41df-fd41-42e5-b4fc-f23341e34807 req-963dead2-d829-4ecf-9876-654b14d7428f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:name>tempest-TestNetworkBasicOps-server-3559488</nova:name>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:creationTime>2026-01-31 06:42:06</nova:creationTime>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:flavor name="m1.nano">
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:memory>128</nova:memory>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:disk>1</nova:disk>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:swap>0</nova:swap>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:vcpus>1</nova:vcpus>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </nova:flavor>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:owner>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </nova:owner>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  <nova:ports>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    <nova:port uuid="df57296c-8ac3-44b0-8fb6-9f85d0a93bdc">
Jan 31 01:42:06 np0005603500 nova_compute[182934]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:    </nova:port>
Jan 31 01:42:06 np0005603500 nova_compute[182934]:  </nova:ports>
Jan 31 01:42:06 np0005603500 nova_compute[182934]: </nova:instance>
Jan 31 01:42:06 np0005603500 nova_compute[182934]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Jan 31 01:42:07 np0005603500 nova_compute[182934]: 2026-01-31 06:42:07.114 182938 DEBUG nova.compute.manager [req-3fbc05c6-1039-4912-85f1-992fc0414343 req-953f5e74-942b-4976-a531-42e7809e148e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-changed-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:42:07 np0005603500 nova_compute[182934]: 2026-01-31 06:42:07.115 182938 DEBUG nova.compute.manager [req-3fbc05c6-1039-4912-85f1-992fc0414343 req-953f5e74-942b-4976-a531-42e7809e148e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing instance network info cache due to event network-changed-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:42:07 np0005603500 nova_compute[182934]: 2026-01-31 06:42:07.115 182938 DEBUG oslo_concurrency.lockutils [req-3fbc05c6-1039-4912-85f1-992fc0414343 req-953f5e74-942b-4976-a531-42e7809e148e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:42:07 np0005603500 podman[216352]: 2026-01-31 06:42:07.166364097 +0000 UTC m=+0.085099976 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 01:42:07 np0005603500 nova_compute[182934]: 2026-01-31 06:42:07.695 182938 DEBUG oslo_concurrency.lockutils [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:07 np0005603500 nova_compute[182934]: 2026-01-31 06:42:07.696 182938 DEBUG oslo_concurrency.lockutils [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:07 np0005603500 nova_compute[182934]: 2026-01-31 06:42:07.696 182938 DEBUG oslo_concurrency.lockutils [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:07 np0005603500 nova_compute[182934]: 2026-01-31 06:42:07.696 182938 DEBUG oslo_concurrency.lockutils [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:07 np0005603500 nova_compute[182934]: 2026-01-31 06:42:07.697 182938 DEBUG oslo_concurrency.lockutils [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:07 np0005603500 nova_compute[182934]: 2026-01-31 06:42:07.698 182938 INFO nova.compute.manager [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Terminating instance
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.207 182938 DEBUG nova.compute.manager [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:42:08 np0005603500 kernel: tapdf57296c-8a (unregistering): left promiscuous mode
Jan 31 01:42:08 np0005603500 NetworkManager[55506]: <info>  [1769841728.2349] device (tapdf57296c-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:42:08 np0005603500 ovn_controller[95398]: 2026-01-31T06:42:08Z|00122|binding|INFO|Releasing lport df57296c-8ac3-44b0-8fb6-9f85d0a93bdc from this chassis (sb_readonly=0)
Jan 31 01:42:08 np0005603500 ovn_controller[95398]: 2026-01-31T06:42:08Z|00123|binding|INFO|Setting lport df57296c-8ac3-44b0-8fb6-9f85d0a93bdc down in Southbound
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.239 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:08 np0005603500 ovn_controller[95398]: 2026-01-31T06:42:08Z|00124|binding|INFO|Removing iface tapdf57296c-8a ovn-installed in OVS
Jan 31 01:42:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:08.246 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:6a:c9 10.100.0.14'], port_security=['fa:16:3e:bf:6a:c9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8515b99e-89c2-4a03-9a14-d6a0c3dca692', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6059e68c-416d-405d-841f-1d281e86dc7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef63d3b7-bb4b-4593-95e1-75f2cdd31d5e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:42:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:08.247 104644 INFO neutron.agent.ovn.metadata.agent [-] Port df57296c-8ac3-44b0-8fb6-9f85d0a93bdc in datapath 719b4b2c-8a13-4225-a6bd-071da9c7ca99 unbound from our chassis
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.248 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:08.248 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 719b4b2c-8a13-4225-a6bd-071da9c7ca99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:42:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:08.249 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[1607b203-7fb0-4a39-9200-c442f95f92b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:08.250 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99 namespace which is not needed anymore
Jan 31 01:42:08 np0005603500 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 31 01:42:08 np0005603500 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 19.806s CPU time.
Jan 31 01:42:08 np0005603500 systemd-machined[154375]: Machine qemu-6-instance-00000006 terminated.
Jan 31 01:42:08 np0005603500 neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99[215326]: [NOTICE]   (215330) : haproxy version is 2.8.14-c23fe91
Jan 31 01:42:08 np0005603500 podman[216401]: 2026-01-31 06:42:08.345665251 +0000 UTC m=+0.022838870 container kill 0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99, tcib_managed=true, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:42:08 np0005603500 neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99[215326]: [NOTICE]   (215330) : path to executable is /usr/sbin/haproxy
Jan 31 01:42:08 np0005603500 neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99[215326]: [WARNING]  (215330) : Exiting Master process...
Jan 31 01:42:08 np0005603500 neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99[215326]: [ALERT]    (215330) : Current worker (215332) exited with code 143 (Terminated)
Jan 31 01:42:08 np0005603500 neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99[215326]: [WARNING]  (215330) : All workers exited. Exiting... (0)
Jan 31 01:42:08 np0005603500 systemd[1]: libpod-0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2.scope: Deactivated successfully.
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.452 182938 INFO nova.virt.libvirt.driver [-] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Instance destroyed successfully.
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.453 182938 DEBUG nova.objects.instance [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 8515b99e-89c2-4a03-9a14-d6a0c3dca692 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:42:08 np0005603500 podman[216416]: 2026-01-31 06:42:08.561508969 +0000 UTC m=+0.206164180 container died 0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.700 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.724 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.971 182938 DEBUG nova.virt.libvirt.vif [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:39:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-3559488',display_name='tempest-TestNetworkBasicOps-server-3559488',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-3559488',id=6,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChYKDUjvhjCGs5+r/emnCrvtzxjRm0xGy2EtjvWZf5uSCdLdeFqDloA/lKvnHUNOd7mjbOXH0cnbc13Uy4w1Pw4Qnj3hkpXumiALlpO/vfZJol+WTWZCmQRTTzYN7UeJQ==',key_name='tempest-TestNetworkBasicOps-1595486681',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:39:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-gu629u0n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:39:29Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=8515b99e-89c2-4a03-9a14-d6a0c3dca692,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.971 182938 DEBUG nova.network.os_vif_util [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.972 182938 DEBUG nova.network.os_vif_util [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:6a:c9,bridge_name='br-int',has_traffic_filtering=True,id=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc,network=Network(719b4b2c-8a13-4225-a6bd-071da9c7ca99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf57296c-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.972 182938 DEBUG os_vif [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:6a:c9,bridge_name='br-int',has_traffic_filtering=True,id=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc,network=Network(719b4b2c-8a13-4225-a6bd-071da9c7ca99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf57296c-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.974 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.974 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf57296c-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.975 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.978 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.978 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.979 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=61772f54-5353-45af-a365-cfa53eba2e42) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.979 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.981 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.983 182938 INFO os_vif [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:6a:c9,bridge_name='br-int',has_traffic_filtering=True,id=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc,network=Network(719b4b2c-8a13-4225-a6bd-071da9c7ca99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf57296c-8a')
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.984 182938 INFO nova.virt.libvirt.driver [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Deleting instance files /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692_del
Jan 31 01:42:08 np0005603500 nova_compute[182934]: 2026-01-31 06:42:08.985 182938 INFO nova.virt.libvirt.driver [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Deletion of /var/lib/nova/instances/8515b99e-89c2-4a03-9a14-d6a0c3dca692_del complete
Jan 31 01:42:09 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2-userdata-shm.mount: Deactivated successfully.
Jan 31 01:42:09 np0005603500 systemd[1]: var-lib-containers-storage-overlay-dfc1cc840a7aff4c3b5038ce5865bdb6f5903485bbe17a8486b932aea2129e7c-merged.mount: Deactivated successfully.
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.372 182938 DEBUG nova.compute.manager [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-unplugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.372 182938 DEBUG oslo_concurrency.lockutils [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.372 182938 DEBUG oslo_concurrency.lockutils [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.373 182938 DEBUG oslo_concurrency.lockutils [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.373 182938 DEBUG nova.compute.manager [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] No waiting events found dispatching network-vif-unplugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.373 182938 DEBUG nova.compute.manager [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-unplugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.374 182938 DEBUG nova.compute.manager [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-plugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.374 182938 DEBUG oslo_concurrency.lockutils [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.374 182938 DEBUG oslo_concurrency.lockutils [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.375 182938 DEBUG oslo_concurrency.lockutils [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.375 182938 DEBUG nova.compute.manager [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] No waiting events found dispatching network-vif-plugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.375 182938 WARNING nova.compute.manager [req-2466494d-5890-49a4-a199-f6e98696e19f req-20edc63f-1640-4945-a4ec-a7cd749f1530 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received unexpected event network-vif-plugged-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc for instance with vm_state active and task_state deleting.
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.498 182938 INFO nova.compute.manager [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Took 1.29 seconds to destroy the instance on the hypervisor.
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.498 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.498 182938 DEBUG nova.compute.manager [-] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:42:09 np0005603500 nova_compute[182934]: 2026-01-31 06:42:09.499 182938 DEBUG nova.network.neutron [-] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:42:09 np0005603500 podman[216416]: 2026-01-31 06:42:09.553109142 +0000 UTC m=+1.197764373 container cleanup 0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:42:09 np0005603500 systemd[1]: libpod-conmon-0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2.scope: Deactivated successfully.
Jan 31 01:42:10 np0005603500 podman[216444]: 2026-01-31 06:42:10.171778815 +0000 UTC m=+1.619570724 container remove 0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.175 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbdb95d-44b0-4d66-ba2e-6e907b8e4056]: (4, ("Sat Jan 31 06:42:08 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99 (0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2)\n0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2\nSat Jan 31 06:42:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99 (0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2)\n0ed8827d1ab41d4cb3ceb10efd454f5157529fdc45b98227d53a4dba5cd37ff2\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.177 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e4a235-56ea-4f2e-b3db-5cd1ce01681e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.178 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/719b4b2c-8a13-4225-a6bd-071da9c7ca99.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.178 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4526004f-87c1-4fab-8797-bd0916396273]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.179 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap719b4b2c-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:42:10 np0005603500 nova_compute[182934]: 2026-01-31 06:42:10.181 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:10 np0005603500 kernel: tap719b4b2c-80: left promiscuous mode
Jan 31 01:42:10 np0005603500 nova_compute[182934]: 2026-01-31 06:42:10.186 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.189 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[277ea079-a512-40b4-a0af-5119a8268478]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.201 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6b89c2bc-dffe-47fd-b8be-6235af36155b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.202 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[954b76a5-d0f2-4461-b2ff-f956b2cf686c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.215 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[08779b63-e0c6-4298-a3c7-27015fb9c131]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387864, 'reachable_time': 15111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216464, 'error': None, 'target': 'ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.217 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-719b4b2c-8a13-4225-a6bd-071da9c7ca99 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:42:10 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:10.218 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[e34ba07f-1bcc-4c12-9275-25666b9e3cef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:10 np0005603500 systemd[1]: run-netns-ovnmeta\x2d719b4b2c\x2d8a13\x2d4225\x2da6bd\x2d071da9c7ca99.mount: Deactivated successfully.
Jan 31 01:42:10 np0005603500 nova_compute[182934]: 2026-01-31 06:42:10.806 182938 DEBUG nova.network.neutron [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [{"id": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "address": "fa:16:3e:bf:6a:c9", "network": {"id": "719b4b2c-8a13-4225-a6bd-071da9c7ca99", "bridge": "br-int", "label": "tempest-network-smoke--720963365", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf57296c-8a", "ovs_interfaceid": "df57296c-8ac3-44b0-8fb6-9f85d0a93bdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:42:11 np0005603500 nova_compute[182934]: 2026-01-31 06:42:11.321 182938 DEBUG oslo_concurrency.lockutils [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:42:11 np0005603500 nova_compute[182934]: 2026-01-31 06:42:11.323 182938 DEBUG oslo_concurrency.lockutils [req-3fbc05c6-1039-4912-85f1-992fc0414343 req-953f5e74-942b-4976-a531-42e7809e148e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:42:11 np0005603500 nova_compute[182934]: 2026-01-31 06:42:11.324 182938 DEBUG nova.network.neutron [req-3fbc05c6-1039-4912-85f1-992fc0414343 req-953f5e74-942b-4976-a531-42e7809e148e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Refreshing network info cache for port df57296c-8ac3-44b0-8fb6-9f85d0a93bdc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:42:11 np0005603500 nova_compute[182934]: 2026-01-31 06:42:11.613 182938 DEBUG nova.compute.manager [req-9fb79c83-288b-47b7-9f01-8cf88d6ce365 req-324b5bff-4f39-48f3-b07b-a92563ebf00b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Received event network-vif-deleted-df57296c-8ac3-44b0-8fb6-9f85d0a93bdc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:42:11 np0005603500 nova_compute[182934]: 2026-01-31 06:42:11.613 182938 INFO nova.compute.manager [req-9fb79c83-288b-47b7-9f01-8cf88d6ce365 req-324b5bff-4f39-48f3-b07b-a92563ebf00b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Neutron deleted interface df57296c-8ac3-44b0-8fb6-9f85d0a93bdc; detaching it from the instance and deleting it from the info cache
Jan 31 01:42:11 np0005603500 nova_compute[182934]: 2026-01-31 06:42:11.614 182938 DEBUG nova.network.neutron [req-9fb79c83-288b-47b7-9f01-8cf88d6ce365 req-324b5bff-4f39-48f3-b07b-a92563ebf00b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:42:11 np0005603500 nova_compute[182934]: 2026-01-31 06:42:11.853 182938 DEBUG nova.network.neutron [-] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:42:11 np0005603500 nova_compute[182934]: 2026-01-31 06:42:11.858 182938 DEBUG oslo_concurrency.lockutils [None req-06a6446c-1f1c-44e2-a651-030111fde3a1 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "interface-8515b99e-89c2-4a03-9a14-d6a0c3dca692-33eb9351-9906-4872-b960-8ca2037338a5" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 29.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:12 np0005603500 nova_compute[182934]: 2026-01-31 06:42:12.122 182938 DEBUG nova.compute.manager [req-9fb79c83-288b-47b7-9f01-8cf88d6ce365 req-324b5bff-4f39-48f3-b07b-a92563ebf00b 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Detach interface failed, port_id=df57296c-8ac3-44b0-8fb6-9f85d0a93bdc, reason: Instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Jan 31 01:42:12 np0005603500 nova_compute[182934]: 2026-01-31 06:42:12.361 182938 INFO nova.compute.manager [-] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Took 2.86 seconds to deallocate network for instance.
Jan 31 01:42:12 np0005603500 nova_compute[182934]: 2026-01-31 06:42:12.788 182938 INFO nova.network.neutron [req-3fbc05c6-1039-4912-85f1-992fc0414343 req-953f5e74-942b-4976-a531-42e7809e148e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Port df57296c-8ac3-44b0-8fb6-9f85d0a93bdc from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 31 01:42:12 np0005603500 nova_compute[182934]: 2026-01-31 06:42:12.789 182938 DEBUG nova.network.neutron [req-3fbc05c6-1039-4912-85f1-992fc0414343 req-953f5e74-942b-4976-a531-42e7809e148e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 8515b99e-89c2-4a03-9a14-d6a0c3dca692] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:42:12 np0005603500 nova_compute[182934]: 2026-01-31 06:42:12.871 182938 DEBUG oslo_concurrency.lockutils [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:12 np0005603500 nova_compute[182934]: 2026-01-31 06:42:12.872 182938 DEBUG oslo_concurrency.lockutils [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:12 np0005603500 nova_compute[182934]: 2026-01-31 06:42:12.924 182938 DEBUG nova.compute.provider_tree [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:42:13 np0005603500 nova_compute[182934]: 2026-01-31 06:42:13.295 182938 DEBUG oslo_concurrency.lockutils [req-3fbc05c6-1039-4912-85f1-992fc0414343 req-953f5e74-942b-4976-a531-42e7809e148e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-8515b99e-89c2-4a03-9a14-d6a0c3dca692" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:42:13 np0005603500 nova_compute[182934]: 2026-01-31 06:42:13.432 182938 DEBUG nova.scheduler.client.report [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:42:13 np0005603500 nova_compute[182934]: 2026-01-31 06:42:13.725 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:13 np0005603500 nova_compute[182934]: 2026-01-31 06:42:13.942 182938 DEBUG oslo_concurrency.lockutils [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:13 np0005603500 nova_compute[182934]: 2026-01-31 06:42:13.965 182938 INFO nova.scheduler.client.report [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 8515b99e-89c2-4a03-9a14-d6a0c3dca692
Jan 31 01:42:13 np0005603500 nova_compute[182934]: 2026-01-31 06:42:13.980 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:14 np0005603500 podman[216465]: 2026-01-31 06:42:14.149463448 +0000 UTC m=+0.073235278 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 31 01:42:14 np0005603500 nova_compute[182934]: 2026-01-31 06:42:14.984 182938 DEBUG oslo_concurrency.lockutils [None req-902d4992-1365-46e4-9ca3-327ec790e5e2 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "8515b99e-89c2-4a03-9a14-d6a0c3dca692" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:15 np0005603500 podman[216486]: 2026-01-31 06:42:15.114731121 +0000 UTC m=+0.039096708 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.983 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.985 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:42:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:42:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:42:18 np0005603500 nova_compute[182934]: 2026-01-31 06:42:18.727 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:19 np0005603500 nova_compute[182934]: 2026-01-31 06:42:19.077 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:20 np0005603500 nova_compute[182934]: 2026-01-31 06:42:20.491 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:20 np0005603500 nova_compute[182934]: 2026-01-31 06:42:20.509 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:23 np0005603500 nova_compute[182934]: 2026-01-31 06:42:23.728 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:24 np0005603500 nova_compute[182934]: 2026-01-31 06:42:24.079 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:28 np0005603500 podman[216513]: 2026-01-31 06:42:28.123523982 +0000 UTC m=+0.042899600 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 01:42:28 np0005603500 podman[216512]: 2026-01-31 06:42:28.14730757 +0000 UTC m=+0.069068514 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:42:28 np0005603500 nova_compute[182934]: 2026-01-31 06:42:28.730 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:29 np0005603500 nova_compute[182934]: 2026-01-31 06:42:29.081 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:30.809 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:fd:9c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=074897e1-acb2-45fc-95e8-d7347eaae0a8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1539f259-b2fb-49d9-b369-ffe8f622864d) old=Port_Binding(mac=['fa:16:3e:9b:fd:9c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:42:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:30.810 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1539f259-b2fb-49d9-b369-ffe8f622864d in datapath f945b6f3-ac5f-4949-a595-265a9c245851 updated
Jan 31 01:42:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:30.812 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f945b6f3-ac5f-4949-a595-265a9c245851, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:42:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:30.812 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e8b4eed6-1ad7-46d0-9030-09df594d7e75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:42:33 np0005603500 nova_compute[182934]: 2026-01-31 06:42:33.732 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:33 np0005603500 nova_compute[182934]: 2026-01-31 06:42:33.761 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:33.763 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:42:33 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:33.764 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:42:34 np0005603500 nova_compute[182934]: 2026-01-31 06:42:34.083 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:34 np0005603500 nova_compute[182934]: 2026-01-31 06:42:34.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:42:36 np0005603500 nova_compute[182934]: 2026-01-31 06:42:36.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:42:37 np0005603500 podman[216554]: 2026-01-31 06:42:37.127307217 +0000 UTC m=+0.049968766 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.666 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.667 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:42:37 np0005603500 podman[216575]: 2026-01-31 06:42:37.767527408 +0000 UTC m=+0.062305400 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.806 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.807 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5770MB free_disk=73.21212768554688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.807 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:37 np0005603500 nova_compute[182934]: 2026-01-31 06:42:37.807 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:38 np0005603500 nova_compute[182934]: 2026-01-31 06:42:38.733 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:38 np0005603500 nova_compute[182934]: 2026-01-31 06:42:38.851 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:42:38 np0005603500 nova_compute[182934]: 2026-01-31 06:42:38.852 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:42:38 np0005603500 nova_compute[182934]: 2026-01-31 06:42:38.872 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:42:39 np0005603500 nova_compute[182934]: 2026-01-31 06:42:39.085 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:39 np0005603500 nova_compute[182934]: 2026-01-31 06:42:39.382 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:42:39 np0005603500 nova_compute[182934]: 2026-01-31 06:42:39.895 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:42:39 np0005603500 nova_compute[182934]: 2026-01-31 06:42:39.896 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:40 np0005603500 nova_compute[182934]: 2026-01-31 06:42:40.896 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:42:40 np0005603500 nova_compute[182934]: 2026-01-31 06:42:40.897 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:42:41 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:41.765 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:42:42 np0005603500 nova_compute[182934]: 2026-01-31 06:42:42.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:42:43 np0005603500 nova_compute[182934]: 2026-01-31 06:42:43.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:42:43 np0005603500 nova_compute[182934]: 2026-01-31 06:42:43.735 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:44 np0005603500 nova_compute[182934]: 2026-01-31 06:42:44.088 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:45 np0005603500 podman[216601]: 2026-01-31 06:42:45.123989304 +0000 UTC m=+0.046806944 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127)
Jan 31 01:42:45 np0005603500 podman[216622]: 2026-01-31 06:42:45.197637025 +0000 UTC m=+0.047168666 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:42:48 np0005603500 nova_compute[182934]: 2026-01-31 06:42:48.738 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:49 np0005603500 nova_compute[182934]: 2026-01-31 06:42:49.089 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:50 np0005603500 nova_compute[182934]: 2026-01-31 06:42:50.572 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "25a251c7-ba38-4995-b170-837a778f9e89" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:50 np0005603500 nova_compute[182934]: 2026-01-31 06:42:50.572 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:51 np0005603500 nova_compute[182934]: 2026-01-31 06:42:51.077 182938 DEBUG nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:42:51 np0005603500 nova_compute[182934]: 2026-01-31 06:42:51.635 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:51 np0005603500 nova_compute[182934]: 2026-01-31 06:42:51.636 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:51 np0005603500 nova_compute[182934]: 2026-01-31 06:42:51.645 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:42:51 np0005603500 nova_compute[182934]: 2026-01-31 06:42:51.646 182938 INFO nova.compute.claims [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:42:52 np0005603500 nova_compute[182934]: 2026-01-31 06:42:52.721 182938 DEBUG nova.compute.provider_tree [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:42:53 np0005603500 nova_compute[182934]: 2026-01-31 06:42:53.233 182938 DEBUG nova.scheduler.client.report [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:42:53 np0005603500 nova_compute[182934]: 2026-01-31 06:42:53.741 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:53 np0005603500 nova_compute[182934]: 2026-01-31 06:42:53.744 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:53 np0005603500 nova_compute[182934]: 2026-01-31 06:42:53.745 182938 DEBUG nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:42:54 np0005603500 nova_compute[182934]: 2026-01-31 06:42:54.090 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:54 np0005603500 nova_compute[182934]: 2026-01-31 06:42:54.257 182938 DEBUG nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:42:54 np0005603500 nova_compute[182934]: 2026-01-31 06:42:54.257 182938 DEBUG nova.network.neutron [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:42:54 np0005603500 nova_compute[182934]: 2026-01-31 06:42:54.764 182938 INFO nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:42:55 np0005603500 nova_compute[182934]: 2026-01-31 06:42:55.157 182938 DEBUG nova.policy [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:42:55 np0005603500 nova_compute[182934]: 2026-01-31 06:42:55.273 182938 DEBUG nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:42:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:56.092 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:56.092 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:42:56.092 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.297 182938 DEBUG nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.298 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.298 182938 INFO nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Creating image(s)
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.299 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.299 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.300 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.300 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.304 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.305 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.349 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.350 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.350 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.351 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.355 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.355 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.401 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.402 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.422 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.423 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.424 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.465 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.465 182938 DEBUG nova.virt.disk.api [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.466 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.506 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.507 182938 DEBUG nova.virt.disk.api [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.507 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.508 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Ensure instance console log exists: /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.508 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.508 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:42:56 np0005603500 nova_compute[182934]: 2026-01-31 06:42:56.508 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:42:57 np0005603500 nova_compute[182934]: 2026-01-31 06:42:57.341 182938 DEBUG nova.network.neutron [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Successfully updated port: 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:42:57 np0005603500 nova_compute[182934]: 2026-01-31 06:42:57.602 182938 DEBUG nova.compute.manager [req-610e7c2b-457a-49db-a3c0-711e7cc4d7f0 req-d1d3551f-9f8d-4a89-b164-1f8bedcabf56 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Received event network-changed-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:42:57 np0005603500 nova_compute[182934]: 2026-01-31 06:42:57.602 182938 DEBUG nova.compute.manager [req-610e7c2b-457a-49db-a3c0-711e7cc4d7f0 req-d1d3551f-9f8d-4a89-b164-1f8bedcabf56 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Refreshing instance network info cache due to event network-changed-12d11bf4-59e7-416e-87e4-7fa01bcbfb01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:42:57 np0005603500 nova_compute[182934]: 2026-01-31 06:42:57.603 182938 DEBUG oslo_concurrency.lockutils [req-610e7c2b-457a-49db-a3c0-711e7cc4d7f0 req-d1d3551f-9f8d-4a89-b164-1f8bedcabf56 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-25a251c7-ba38-4995-b170-837a778f9e89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:42:57 np0005603500 nova_compute[182934]: 2026-01-31 06:42:57.603 182938 DEBUG oslo_concurrency.lockutils [req-610e7c2b-457a-49db-a3c0-711e7cc4d7f0 req-d1d3551f-9f8d-4a89-b164-1f8bedcabf56 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-25a251c7-ba38-4995-b170-837a778f9e89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:42:57 np0005603500 nova_compute[182934]: 2026-01-31 06:42:57.603 182938 DEBUG nova.network.neutron [req-610e7c2b-457a-49db-a3c0-711e7cc4d7f0 req-d1d3551f-9f8d-4a89-b164-1f8bedcabf56 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Refreshing network info cache for port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:42:57 np0005603500 nova_compute[182934]: 2026-01-31 06:42:57.846 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-25a251c7-ba38-4995-b170-837a778f9e89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:42:58 np0005603500 nova_compute[182934]: 2026-01-31 06:42:58.743 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:58 np0005603500 nova_compute[182934]: 2026-01-31 06:42:58.825 182938 DEBUG nova.network.neutron [req-610e7c2b-457a-49db-a3c0-711e7cc4d7f0 req-d1d3551f-9f8d-4a89-b164-1f8bedcabf56 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:42:59 np0005603500 nova_compute[182934]: 2026-01-31 06:42:59.091 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:42:59 np0005603500 podman[216664]: 2026-01-31 06:42:59.118160869 +0000 UTC m=+0.037133885 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 31 01:42:59 np0005603500 podman[216663]: 2026-01-31 06:42:59.149903766 +0000 UTC m=+0.069598725 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:43:00 np0005603500 nova_compute[182934]: 2026-01-31 06:43:00.155 182938 DEBUG nova.network.neutron [req-610e7c2b-457a-49db-a3c0-711e7cc4d7f0 req-d1d3551f-9f8d-4a89-b164-1f8bedcabf56 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:43:00 np0005603500 nova_compute[182934]: 2026-01-31 06:43:00.663 182938 DEBUG oslo_concurrency.lockutils [req-610e7c2b-457a-49db-a3c0-711e7cc4d7f0 req-d1d3551f-9f8d-4a89-b164-1f8bedcabf56 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-25a251c7-ba38-4995-b170-837a778f9e89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:43:00 np0005603500 nova_compute[182934]: 2026-01-31 06:43:00.664 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-25a251c7-ba38-4995-b170-837a778f9e89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:43:00 np0005603500 nova_compute[182934]: 2026-01-31 06:43:00.664 182938 DEBUG nova.network.neutron [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:43:01 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:01Z|00125|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 31 01:43:01 np0005603500 nova_compute[182934]: 2026-01-31 06:43:01.830 182938 DEBUG nova.network.neutron [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:43:03 np0005603500 nova_compute[182934]: 2026-01-31 06:43:03.743 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:03 np0005603500 nova_compute[182934]: 2026-01-31 06:43:03.923 182938 DEBUG nova.network.neutron [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Updating instance_info_cache with network_info: [{"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.092 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.429 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-25a251c7-ba38-4995-b170-837a778f9e89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.430 182938 DEBUG nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Instance network_info: |[{"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.432 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Start _get_guest_xml network_info=[{"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.436 182938 WARNING nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.437 182938 DEBUG nova.virt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1240453589', uuid='25a251c7-ba38-4995-b170-837a778f9e89'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841784.437063) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.444 182938 DEBUG nova.virt.libvirt.host [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.445 182938 DEBUG nova.virt.libvirt.host [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.449 182938 DEBUG nova.virt.libvirt.host [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.449 182938 DEBUG nova.virt.libvirt.host [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.450 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.450 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.450 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.450 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.451 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.451 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.451 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.451 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.451 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.452 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.452 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.452 182938 DEBUG nova.virt.hardware [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.455 182938 DEBUG nova.virt.libvirt.vif [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:42:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1240453589',display_name='tempest-TestNetworkBasicOps-server-1240453589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1240453589',id=8,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEfeKtSs+Xmx+OqbxkqS/QnJdOWLNSHvtf/o1Pd/v4Cr6Er3hqn25XJt4bfUCIpVW6CVNkiUTwKVmcamf17CqNMHIxc7S0r4TRko8LuUxn2c0Ap3DpPwXJQnG6By5ED+Gw==',key_name='tempest-TestNetworkBasicOps-428166205',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-t5risaf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:42:55Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=25a251c7-ba38-4995-b170-837a778f9e89,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.456 182938 DEBUG nova.network.os_vif_util [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.456 182938 DEBUG nova.network.os_vif_util [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:43:04 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.457 182938 DEBUG nova.objects.instance [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 25a251c7-ba38-4995-b170-837a778f9e89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:04.999 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <uuid>25a251c7-ba38-4995-b170-837a778f9e89</uuid>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <name>instance-00000008</name>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-1240453589</nova:name>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:43:04</nova:creationTime>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:        <nova:port uuid="12d11bf4-59e7-416e-87e4-7fa01bcbfb01">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <entry name="serial">25a251c7-ba38-4995-b170-837a778f9e89</entry>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <entry name="uuid">25a251c7-ba38-4995-b170-837a778f9e89</entry>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk.config"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:ac:86:f7"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <target dev="tap12d11bf4-59"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/console.log" append="off"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:43:05 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:43:05 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:43:05 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:43:05 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.000 182938 DEBUG nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Preparing to wait for external event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.000 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "25a251c7-ba38-4995-b170-837a778f9e89-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.001 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.001 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.002 182938 DEBUG nova.virt.libvirt.vif [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:42:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1240453589',display_name='tempest-TestNetworkBasicOps-server-1240453589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1240453589',id=8,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEfeKtSs+Xmx+OqbxkqS/QnJdOWLNSHvtf/o1Pd/v4Cr6Er3hqn25XJt4bfUCIpVW6CVNkiUTwKVmcamf17CqNMHIxc7S0r4TRko8LuUxn2c0Ap3DpPwXJQnG6By5ED+Gw==',key_name='tempest-TestNetworkBasicOps-428166205',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-t5risaf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:42:55Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=25a251c7-ba38-4995-b170-837a778f9e89,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.002 182938 DEBUG nova.network.os_vif_util [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.003 182938 DEBUG nova.network.os_vif_util [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.003 182938 DEBUG os_vif [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.003 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.004 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.004 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.005 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.005 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9aa4cca6-1bf6-5c63-9332-34cc3ac60574', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.006 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.007 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.009 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.009 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12d11bf4-59, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.010 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap12d11bf4-59, col_values=(('qos', UUID('b069ffe4-ddea-4793-9a78-3a19b3a25e63')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.010 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap12d11bf4-59, col_values=(('external_ids', {'iface-id': '12d11bf4-59e7-416e-87e4-7fa01bcbfb01', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:86:f7', 'vm-uuid': '25a251c7-ba38-4995-b170-837a778f9e89'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.011 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:05 np0005603500 NetworkManager[55506]: <info>  [1769841785.0126] manager: (tap12d11bf4-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.014 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.017 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:05 np0005603500 nova_compute[182934]: 2026-01-31 06:43:05.017 182938 INFO os_vif [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59')
Jan 31 01:43:06 np0005603500 nova_compute[182934]: 2026-01-31 06:43:06.567 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:43:06 np0005603500 nova_compute[182934]: 2026-01-31 06:43:06.568 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:43:06 np0005603500 nova_compute[182934]: 2026-01-31 06:43:06.569 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:ac:86:f7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:43:06 np0005603500 nova_compute[182934]: 2026-01-31 06:43:06.570 182938 INFO nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Using config drive
Jan 31 01:43:08 np0005603500 podman[216709]: 2026-01-31 06:43:08.132269951 +0000 UTC m=+0.050993287 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Jan 31 01:43:08 np0005603500 podman[216708]: 2026-01-31 06:43:08.151282812 +0000 UTC m=+0.069060369 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.169 182938 INFO nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Creating config drive at /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk.config
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.174 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpa3l2pvo4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.295 182938 DEBUG oslo_concurrency.processutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpa3l2pvo4" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:43:08 np0005603500 kernel: tap12d11bf4-59: entered promiscuous mode
Jan 31 01:43:08 np0005603500 NetworkManager[55506]: <info>  [1769841788.3407] manager: (tap12d11bf4-59): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Jan 31 01:43:08 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:08Z|00126|binding|INFO|Claiming lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for this chassis.
Jan 31 01:43:08 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:08Z|00127|binding|INFO|12d11bf4-59e7-416e-87e4-7fa01bcbfb01: Claiming fa:16:3e:ac:86:f7 10.100.0.14
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.342 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.349 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.355 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:86:f7 10.100.0.14'], port_security=['fa:16:3e:ac:86:f7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '25a251c7-ba38-4995-b170-837a778f9e89', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '859450e8-8706-4a8f-9018-d6c18f1c21cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=074897e1-acb2-45fc-95e8-d7347eaae0a8, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=12d11bf4-59e7-416e-87e4-7fa01bcbfb01) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.357 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 in datapath f945b6f3-ac5f-4949-a595-265a9c245851 bound to our chassis
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.358 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f945b6f3-ac5f-4949-a595-265a9c245851
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.364 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:08 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:08Z|00128|binding|INFO|Setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 ovn-installed in OVS
Jan 31 01:43:08 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:08Z|00129|binding|INFO|Setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 up in Southbound
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.368 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:08 np0005603500 systemd-udevd[216774]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.370 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[19aa1cb2-f707-48dd-8813-7ba4593f87a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.373 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf945b6f3-a1 in ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:43:08 np0005603500 systemd-machined[154375]: New machine qemu-8-instance-00000008.
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.375 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf945b6f3-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.375 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb15237-54b5-4079-8d58-aeeb005e0842]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.377 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d7946c-25a0-4629-a4a3-5bbdc33ea025]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 NetworkManager[55506]: <info>  [1769841788.3805] device (tap12d11bf4-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:43:08 np0005603500 NetworkManager[55506]: <info>  [1769841788.3813] device (tap12d11bf4-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:43:08 np0005603500 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.387 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[53a9f5e1-cb8e-4bbe-b752-8b009ff26996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.399 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb8d2b1-1b23-4907-a987-aa7be46fe159]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.416 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc7f77d-1e86-458e-bcf1-e77219ce20d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 systemd-udevd[216776]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.422 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[93ecd86d-6823-43ce-8c1e-ab588624c4c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 NetworkManager[55506]: <info>  [1769841788.4229] manager: (tapf945b6f3-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.442 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[570d26eb-ee90-4d53-b991-b4ccbff99c32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.444 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[ce328ce4-f0b8-4930-a0d8-4aa8040ad738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 NetworkManager[55506]: <info>  [1769841788.4602] device (tapf945b6f3-a0): carrier: link connected
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.462 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c48b73-b7d3-4162-be21-0ba3a41afecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.477 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[836180f3-96f8-41f6-893c-a5a5470db43b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf945b6f3-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:fd:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409975, 'reachable_time': 20937, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216806, 'error': None, 'target': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.491 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c8098177-cae2-4144-9467-c1b0f7354150]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:fd9c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409975, 'tstamp': 409975}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216807, 'error': None, 'target': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.505 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[b83e4464-7582-4888-bcbe-9b149fe9d490]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf945b6f3-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:fd:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409975, 'reachable_time': 20937, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216808, 'error': None, 'target': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.526 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[83073792-abbf-4e4d-b474-c27ba65da28a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.580 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e6287ef8-86b0-4db0-b0c3-c7d5ee38c036]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.582 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf945b6f3-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.582 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.583 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf945b6f3-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.585 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:08 np0005603500 NetworkManager[55506]: <info>  [1769841788.5858] manager: (tapf945b6f3-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Jan 31 01:43:08 np0005603500 kernel: tapf945b6f3-a0: entered promiscuous mode
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.588 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf945b6f3-a0, col_values=(('external_ids', {'iface-id': '1539f259-b2fb-49d9-b369-ffe8f622864d'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.590 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:08 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:08Z|00130|binding|INFO|Releasing lport 1539f259-b2fb-49d9-b369-ffe8f622864d from this chassis (sb_readonly=0)
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.591 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.592 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4c079583-4dc6-49fb-8d33-6b5871c9e049]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.593 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.593 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.594 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f945b6f3-ac5f-4949-a595-265a9c245851 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.594 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.595 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a803ca-879a-4d7a-b393-77a26f562a5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.595 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.596 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.596 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[0b18e048-bfa8-4f2f-8e37-e1bb52ed0c7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.597 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-f945b6f3-ac5f-4949-a595-265a9c245851
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID f945b6f3-ac5f-4949-a595-265a9c245851
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:43:08 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:08.599 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'env', 'PROCESS_TAG=haproxy-f945b6f3-ac5f-4949-a595-265a9c245851', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f945b6f3-ac5f-4949-a595-265a9c245851.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:43:08 np0005603500 nova_compute[182934]: 2026-01-31 06:43:08.744 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:08 np0005603500 podman[216847]: 2026-01-31 06:43:08.903687321 +0000 UTC m=+0.042815853 container create f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 01:43:08 np0005603500 systemd[1]: Started libpod-conmon-f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353.scope.
Jan 31 01:43:08 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:43:08 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c70fc7437544b42e02ef9f0d788c5de48ecec01d6bc7f018c38ba671d30d03e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:43:08 np0005603500 podman[216847]: 2026-01-31 06:43:08.956226444 +0000 UTC m=+0.095355026 container init f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:43:08 np0005603500 podman[216847]: 2026-01-31 06:43:08.959974891 +0000 UTC m=+0.099103443 container start f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:43:08 np0005603500 podman[216847]: 2026-01-31 06:43:08.878531288 +0000 UTC m=+0.017659880 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:43:08 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[216862]: [NOTICE]   (216866) : New worker (216868) forked
Jan 31 01:43:08 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[216862]: [NOTICE]   (216866) : Loading success.
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.104 182938 DEBUG nova.compute.manager [req-7297c8ad-d3a1-4613-b13d-4a5b492563fa req-c5636211-b883-44e7-9ca3-2b33ae7d2274 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Received event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.105 182938 DEBUG oslo_concurrency.lockutils [req-7297c8ad-d3a1-4613-b13d-4a5b492563fa req-c5636211-b883-44e7-9ca3-2b33ae7d2274 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "25a251c7-ba38-4995-b170-837a778f9e89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.105 182938 DEBUG oslo_concurrency.lockutils [req-7297c8ad-d3a1-4613-b13d-4a5b492563fa req-c5636211-b883-44e7-9ca3-2b33ae7d2274 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.105 182938 DEBUG oslo_concurrency.lockutils [req-7297c8ad-d3a1-4613-b13d-4a5b492563fa req-c5636211-b883-44e7-9ca3-2b33ae7d2274 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.106 182938 DEBUG nova.compute.manager [req-7297c8ad-d3a1-4613-b13d-4a5b492563fa req-c5636211-b883-44e7-9ca3-2b33ae7d2274 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Processing event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.106 182938 DEBUG nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.110 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.113 182938 INFO nova.virt.libvirt.driver [-] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Instance spawned successfully.
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.114 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.625 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.626 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.626 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.627 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.627 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:43:09 np0005603500 nova_compute[182934]: 2026-01-31 06:43:09.628 182938 DEBUG nova.virt.libvirt.driver [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:43:10 np0005603500 nova_compute[182934]: 2026-01-31 06:43:10.013 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:10 np0005603500 nova_compute[182934]: 2026-01-31 06:43:10.163 182938 INFO nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Took 13.87 seconds to spawn the instance on the hypervisor.
Jan 31 01:43:10 np0005603500 nova_compute[182934]: 2026-01-31 06:43:10.164 182938 DEBUG nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:43:10 np0005603500 nova_compute[182934]: 2026-01-31 06:43:10.743 182938 INFO nova.compute.manager [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Took 19.15 seconds to build instance.
Jan 31 01:43:11 np0005603500 nova_compute[182934]: 2026-01-31 06:43:11.298 182938 DEBUG oslo_concurrency.lockutils [None req-935d623a-9a9a-409b-b20f-89695e5dbacd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:11 np0005603500 nova_compute[182934]: 2026-01-31 06:43:11.420 182938 DEBUG nova.compute.manager [req-ce7c34ad-eec3-4105-9754-dfecb9f5e065 req-30588470-5d3d-40ff-b1d3-0a72c24f4686 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Received event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:43:11 np0005603500 nova_compute[182934]: 2026-01-31 06:43:11.421 182938 DEBUG oslo_concurrency.lockutils [req-ce7c34ad-eec3-4105-9754-dfecb9f5e065 req-30588470-5d3d-40ff-b1d3-0a72c24f4686 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "25a251c7-ba38-4995-b170-837a778f9e89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:11 np0005603500 nova_compute[182934]: 2026-01-31 06:43:11.421 182938 DEBUG oslo_concurrency.lockutils [req-ce7c34ad-eec3-4105-9754-dfecb9f5e065 req-30588470-5d3d-40ff-b1d3-0a72c24f4686 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:11 np0005603500 nova_compute[182934]: 2026-01-31 06:43:11.422 182938 DEBUG oslo_concurrency.lockutils [req-ce7c34ad-eec3-4105-9754-dfecb9f5e065 req-30588470-5d3d-40ff-b1d3-0a72c24f4686 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:11 np0005603500 nova_compute[182934]: 2026-01-31 06:43:11.422 182938 DEBUG nova.compute.manager [req-ce7c34ad-eec3-4105-9754-dfecb9f5e065 req-30588470-5d3d-40ff-b1d3-0a72c24f4686 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] No waiting events found dispatching network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:43:11 np0005603500 nova_compute[182934]: 2026-01-31 06:43:11.425 182938 WARNING nova.compute.manager [req-ce7c34ad-eec3-4105-9754-dfecb9f5e065 req-30588470-5d3d-40ff-b1d3-0a72c24f4686 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Received unexpected event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with vm_state active and task_state None.
Jan 31 01:43:13 np0005603500 nova_compute[182934]: 2026-01-31 06:43:13.746 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:15 np0005603500 nova_compute[182934]: 2026-01-31 06:43:15.016 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:16 np0005603500 podman[216878]: 2026-01-31 06:43:16.156422457 +0000 UTC m=+0.069258055 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:43:16 np0005603500 podman[216877]: 2026-01-31 06:43:16.166402038 +0000 UTC m=+0.083873069 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:43:16 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:16Z|00131|binding|INFO|Releasing lport 1539f259-b2fb-49d9-b369-ffe8f622864d from this chassis (sb_readonly=0)
Jan 31 01:43:16 np0005603500 nova_compute[182934]: 2026-01-31 06:43:16.872 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:16 np0005603500 NetworkManager[55506]: <info>  [1769841796.8732] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 31 01:43:16 np0005603500 NetworkManager[55506]: <info>  [1769841796.8740] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 31 01:43:16 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:16Z|00132|binding|INFO|Releasing lport 1539f259-b2fb-49d9-b369-ffe8f622864d from this chassis (sb_readonly=0)
Jan 31 01:43:16 np0005603500 nova_compute[182934]: 2026-01-31 06:43:16.879 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:16 np0005603500 nova_compute[182934]: 2026-01-31 06:43:16.883 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:17 np0005603500 nova_compute[182934]: 2026-01-31 06:43:17.574 182938 DEBUG nova.compute.manager [req-746f2463-a637-4409-99f8-43fa142ea71e req-5c146a0a-9cfc-41ff-abd3-5a13bc9db873 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Received event network-changed-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:43:17 np0005603500 nova_compute[182934]: 2026-01-31 06:43:17.575 182938 DEBUG nova.compute.manager [req-746f2463-a637-4409-99f8-43fa142ea71e req-5c146a0a-9cfc-41ff-abd3-5a13bc9db873 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Refreshing instance network info cache due to event network-changed-12d11bf4-59e7-416e-87e4-7fa01bcbfb01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:43:17 np0005603500 nova_compute[182934]: 2026-01-31 06:43:17.575 182938 DEBUG oslo_concurrency.lockutils [req-746f2463-a637-4409-99f8-43fa142ea71e req-5c146a0a-9cfc-41ff-abd3-5a13bc9db873 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-25a251c7-ba38-4995-b170-837a778f9e89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:43:17 np0005603500 nova_compute[182934]: 2026-01-31 06:43:17.575 182938 DEBUG oslo_concurrency.lockutils [req-746f2463-a637-4409-99f8-43fa142ea71e req-5c146a0a-9cfc-41ff-abd3-5a13bc9db873 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-25a251c7-ba38-4995-b170-837a778f9e89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:43:17 np0005603500 nova_compute[182934]: 2026-01-31 06:43:17.576 182938 DEBUG nova.network.neutron [req-746f2463-a637-4409-99f8-43fa142ea71e req-5c146a0a-9cfc-41ff-abd3-5a13bc9db873 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Refreshing network info cache for port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:43:18 np0005603500 nova_compute[182934]: 2026-01-31 06:43:18.748 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:18 np0005603500 nova_compute[182934]: 2026-01-31 06:43:18.936 182938 DEBUG oslo_concurrency.lockutils [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "25a251c7-ba38-4995-b170-837a778f9e89" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:18 np0005603500 nova_compute[182934]: 2026-01-31 06:43:18.937 182938 DEBUG oslo_concurrency.lockutils [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:18 np0005603500 nova_compute[182934]: 2026-01-31 06:43:18.938 182938 DEBUG oslo_concurrency.lockutils [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "25a251c7-ba38-4995-b170-837a778f9e89-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:18 np0005603500 nova_compute[182934]: 2026-01-31 06:43:18.938 182938 DEBUG oslo_concurrency.lockutils [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:18 np0005603500 nova_compute[182934]: 2026-01-31 06:43:18.938 182938 DEBUG oslo_concurrency.lockutils [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:18 np0005603500 nova_compute[182934]: 2026-01-31 06:43:18.939 182938 INFO nova.compute.manager [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Terminating instance
Jan 31 01:43:19 np0005603500 nova_compute[182934]: 2026-01-31 06:43:19.609 182938 DEBUG nova.compute.manager [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.019 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 kernel: tap12d11bf4-59 (unregistering): left promiscuous mode
Jan 31 01:43:20 np0005603500 NetworkManager[55506]: <info>  [1769841800.0630] device (tap12d11bf4-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:43:20 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:20Z|00133|binding|INFO|Releasing lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 from this chassis (sb_readonly=0)
Jan 31 01:43:20 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:20Z|00134|binding|INFO|Setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 down in Southbound
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.066 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:20Z|00135|binding|INFO|Removing iface tap12d11bf4-59 ovn-installed in OVS
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.069 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.073 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 31 01:43:20 np0005603500 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 10.283s CPU time.
Jan 31 01:43:20 np0005603500 systemd-machined[154375]: Machine qemu-8-instance-00000008 terminated.
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.156 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:86:f7 10.100.0.14'], port_security=['fa:16:3e:ac:86:f7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '25a251c7-ba38-4995-b170-837a778f9e89', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '859450e8-8706-4a8f-9018-d6c18f1c21cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=074897e1-acb2-45fc-95e8-d7347eaae0a8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=12d11bf4-59e7-416e-87e4-7fa01bcbfb01) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.158 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 in datapath f945b6f3-ac5f-4949-a595-265a9c245851 unbound from our chassis
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.159 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f945b6f3-ac5f-4949-a595-265a9c245851, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.159 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d0572d-86bc-435a-96c2-727bfe8754ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.160 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 namespace which is not needed anymore
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.226 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.229 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 podman[216952]: 2026-01-31 06:43:20.250472205 +0000 UTC m=+0.027533447 container kill f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 01:43:20 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[216862]: [NOTICE]   (216866) : haproxy version is 2.8.14-c23fe91
Jan 31 01:43:20 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[216862]: [NOTICE]   (216866) : path to executable is /usr/sbin/haproxy
Jan 31 01:43:20 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[216862]: [WARNING]  (216866) : Exiting Master process...
Jan 31 01:43:20 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[216862]: [ALERT]    (216866) : Current worker (216868) exited with code 143 (Terminated)
Jan 31 01:43:20 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[216862]: [WARNING]  (216866) : All workers exited. Exiting... (0)
Jan 31 01:43:20 np0005603500 systemd[1]: libpod-f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353.scope: Deactivated successfully.
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.262 182938 INFO nova.virt.libvirt.driver [-] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Instance destroyed successfully.
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.263 182938 DEBUG nova.objects.instance [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 25a251c7-ba38-4995-b170-837a778f9e89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:43:20 np0005603500 podman[216981]: 2026-01-31 06:43:20.297529178 +0000 UTC m=+0.029461907 container died f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 01:43:20 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353-userdata-shm.mount: Deactivated successfully.
Jan 31 01:43:20 np0005603500 systemd[1]: var-lib-containers-storage-overlay-4c70fc7437544b42e02ef9f0d788c5de48ecec01d6bc7f018c38ba671d30d03e-merged.mount: Deactivated successfully.
Jan 31 01:43:20 np0005603500 podman[216981]: 2026-01-31 06:43:20.352684543 +0000 UTC m=+0.084617252 container cleanup f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 01:43:20 np0005603500 systemd[1]: libpod-conmon-f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353.scope: Deactivated successfully.
Jan 31 01:43:20 np0005603500 podman[216993]: 2026-01-31 06:43:20.37474389 +0000 UTC m=+0.070195444 container remove f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.378 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a4e869ba-02dc-4637-bf2b-4c2568cb3681]: (4, ("Sat Jan 31 06:43:20 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 (f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353)\nf9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353\nSat Jan 31 06:43:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 (f9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353)\nf9291ea8f0cb93284408ce25aa285926d180fee402ee0787a4f4bcac1722f353\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.379 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[11ecd5dc-bd1b-45c7-8384-f46531db47d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.380 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.380 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[192dc051-b8e7-47de-9270-917a5d79ae70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.381 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf945b6f3-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.382 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 kernel: tapf945b6f3-a0: left promiscuous mode
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.387 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.390 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.392 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8679e719-51ed-4ee0-9eaa-b1fa9dcaf92e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.404 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ebdcb1d4-1cfd-46bf-871f-79e1ace591d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.405 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[bfefd0a3-a86f-407b-a062-e935c2f4aa5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.416 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[281fc963-70d3-484e-a6a3-553adaaee117]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409971, 'reachable_time': 17871, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217016, 'error': None, 'target': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:20 np0005603500 systemd[1]: run-netns-ovnmeta\x2df945b6f3\x2dac5f\x2d4949\x2da595\x2d265a9c245851.mount: Deactivated successfully.
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.420 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:43:20 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:20.420 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[ff325891-e832-4594-8aec-b26b604acd33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.667 182938 DEBUG nova.compute.manager [req-4ad0d0dd-5ffe-4548-91b4-63d99a61aa68 req-635adbd5-dc06-4e05-b14f-a0ca8e123217 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Received event network-vif-unplugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.668 182938 DEBUG oslo_concurrency.lockutils [req-4ad0d0dd-5ffe-4548-91b4-63d99a61aa68 req-635adbd5-dc06-4e05-b14f-a0ca8e123217 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "25a251c7-ba38-4995-b170-837a778f9e89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.669 182938 DEBUG oslo_concurrency.lockutils [req-4ad0d0dd-5ffe-4548-91b4-63d99a61aa68 req-635adbd5-dc06-4e05-b14f-a0ca8e123217 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.669 182938 DEBUG oslo_concurrency.lockutils [req-4ad0d0dd-5ffe-4548-91b4-63d99a61aa68 req-635adbd5-dc06-4e05-b14f-a0ca8e123217 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.669 182938 DEBUG nova.compute.manager [req-4ad0d0dd-5ffe-4548-91b4-63d99a61aa68 req-635adbd5-dc06-4e05-b14f-a0ca8e123217 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] No waiting events found dispatching network-vif-unplugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.670 182938 DEBUG nova.compute.manager [req-4ad0d0dd-5ffe-4548-91b4-63d99a61aa68 req-635adbd5-dc06-4e05-b14f-a0ca8e123217 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Received event network-vif-unplugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.900 182938 DEBUG nova.virt.libvirt.vif [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:42:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1240453589',display_name='tempest-TestNetworkBasicOps-server-1240453589',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1240453589',id=8,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEfeKtSs+Xmx+OqbxkqS/QnJdOWLNSHvtf/o1Pd/v4Cr6Er3hqn25XJt4bfUCIpVW6CVNkiUTwKVmcamf17CqNMHIxc7S0r4TRko8LuUxn2c0Ap3DpPwXJQnG6By5ED+Gw==',key_name='tempest-TestNetworkBasicOps-428166205',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:43:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-t5risaf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:43:10Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=25a251c7-ba38-4995-b170-837a778f9e89,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.900 182938 DEBUG nova.network.os_vif_util [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.901 182938 DEBUG nova.network.os_vif_util [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.901 182938 DEBUG os_vif [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.903 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.904 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12d11bf4-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.905 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.906 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.908 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.908 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b069ffe4-ddea-4793-9a78-3a19b3a25e63) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.911 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.912 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.914 182938 INFO os_vif [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59')
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.914 182938 INFO nova.virt.libvirt.driver [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Deleting instance files /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89_del
Jan 31 01:43:20 np0005603500 nova_compute[182934]: 2026-01-31 06:43:20.915 182938 INFO nova.virt.libvirt.driver [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Deletion of /var/lib/nova/instances/25a251c7-ba38-4995-b170-837a778f9e89_del complete
Jan 31 01:43:21 np0005603500 nova_compute[182934]: 2026-01-31 06:43:21.376 182938 DEBUG nova.network.neutron [req-746f2463-a637-4409-99f8-43fa142ea71e req-5c146a0a-9cfc-41ff-abd3-5a13bc9db873 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Updated VIF entry in instance network info cache for port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:43:21 np0005603500 nova_compute[182934]: 2026-01-31 06:43:21.376 182938 DEBUG nova.network.neutron [req-746f2463-a637-4409-99f8-43fa142ea71e req-5c146a0a-9cfc-41ff-abd3-5a13bc9db873 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Updating instance_info_cache with network_info: [{"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:43:21 np0005603500 nova_compute[182934]: 2026-01-31 06:43:21.450 182938 INFO nova.compute.manager [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Took 1.84 seconds to destroy the instance on the hypervisor.
Jan 31 01:43:21 np0005603500 nova_compute[182934]: 2026-01-31 06:43:21.450 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:43:21 np0005603500 nova_compute[182934]: 2026-01-31 06:43:21.451 182938 DEBUG nova.compute.manager [-] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:43:21 np0005603500 nova_compute[182934]: 2026-01-31 06:43:21.451 182938 DEBUG nova.network.neutron [-] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:43:21 np0005603500 nova_compute[182934]: 2026-01-31 06:43:21.973 182938 DEBUG oslo_concurrency.lockutils [req-746f2463-a637-4409-99f8-43fa142ea71e req-5c146a0a-9cfc-41ff-abd3-5a13bc9db873 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-25a251c7-ba38-4995-b170-837a778f9e89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:43:22 np0005603500 nova_compute[182934]: 2026-01-31 06:43:22.912 182938 DEBUG nova.compute.manager [req-a9f305bd-004a-4a62-b189-20cc7f618878 req-f8e67ba6-370c-46ab-a654-ccbe5863fc39 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Received event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:43:22 np0005603500 nova_compute[182934]: 2026-01-31 06:43:22.912 182938 DEBUG oslo_concurrency.lockutils [req-a9f305bd-004a-4a62-b189-20cc7f618878 req-f8e67ba6-370c-46ab-a654-ccbe5863fc39 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "25a251c7-ba38-4995-b170-837a778f9e89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:22 np0005603500 nova_compute[182934]: 2026-01-31 06:43:22.912 182938 DEBUG oslo_concurrency.lockutils [req-a9f305bd-004a-4a62-b189-20cc7f618878 req-f8e67ba6-370c-46ab-a654-ccbe5863fc39 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:22 np0005603500 nova_compute[182934]: 2026-01-31 06:43:22.912 182938 DEBUG oslo_concurrency.lockutils [req-a9f305bd-004a-4a62-b189-20cc7f618878 req-f8e67ba6-370c-46ab-a654-ccbe5863fc39 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:22 np0005603500 nova_compute[182934]: 2026-01-31 06:43:22.913 182938 DEBUG nova.compute.manager [req-a9f305bd-004a-4a62-b189-20cc7f618878 req-f8e67ba6-370c-46ab-a654-ccbe5863fc39 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] No waiting events found dispatching network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:43:22 np0005603500 nova_compute[182934]: 2026-01-31 06:43:22.913 182938 WARNING nova.compute.manager [req-a9f305bd-004a-4a62-b189-20cc7f618878 req-f8e67ba6-370c-46ab-a654-ccbe5863fc39 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Received unexpected event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with vm_state active and task_state deleting.
Jan 31 01:43:23 np0005603500 nova_compute[182934]: 2026-01-31 06:43:23.751 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:25 np0005603500 nova_compute[182934]: 2026-01-31 06:43:25.285 182938 DEBUG nova.network.neutron [-] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:43:25 np0005603500 nova_compute[182934]: 2026-01-31 06:43:25.870 182938 INFO nova.compute.manager [-] [instance: 25a251c7-ba38-4995-b170-837a778f9e89] Took 4.42 seconds to deallocate network for instance.
Jan 31 01:43:25 np0005603500 nova_compute[182934]: 2026-01-31 06:43:25.910 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:26 np0005603500 nova_compute[182934]: 2026-01-31 06:43:26.418 182938 DEBUG oslo_concurrency.lockutils [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:26 np0005603500 nova_compute[182934]: 2026-01-31 06:43:26.418 182938 DEBUG oslo_concurrency.lockutils [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:26 np0005603500 nova_compute[182934]: 2026-01-31 06:43:26.855 182938 DEBUG nova.compute.provider_tree [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:43:27 np0005603500 nova_compute[182934]: 2026-01-31 06:43:27.376 182938 DEBUG nova.scheduler.client.report [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:43:27 np0005603500 nova_compute[182934]: 2026-01-31 06:43:27.918 182938 DEBUG oslo_concurrency.lockutils [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.500s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:28 np0005603500 nova_compute[182934]: 2026-01-31 06:43:28.004 182938 INFO nova.scheduler.client.report [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 25a251c7-ba38-4995-b170-837a778f9e89
Jan 31 01:43:28 np0005603500 nova_compute[182934]: 2026-01-31 06:43:28.753 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:29 np0005603500 nova_compute[182934]: 2026-01-31 06:43:29.121 182938 DEBUG oslo_concurrency.lockutils [None req-1ea4bd9e-0e74-45e2-a17f-ab07c2a6cdec dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "25a251c7-ba38-4995-b170-837a778f9e89" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:30 np0005603500 podman[217018]: 2026-01-31 06:43:30.131422863 +0000 UTC m=+0.044038201 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 01:43:30 np0005603500 podman[217017]: 2026-01-31 06:43:30.131418182 +0000 UTC m=+0.045282289 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:43:30 np0005603500 nova_compute[182934]: 2026-01-31 06:43:30.940 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:33 np0005603500 nova_compute[182934]: 2026-01-31 06:43:33.755 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:34 np0005603500 nova_compute[182934]: 2026-01-31 06:43:34.952 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:34 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:34.952 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:43:34 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:34.954 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:43:35 np0005603500 nova_compute[182934]: 2026-01-31 06:43:35.942 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:36 np0005603500 nova_compute[182934]: 2026-01-31 06:43:36.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:43:36 np0005603500 nova_compute[182934]: 2026-01-31 06:43:36.728 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:43:37 np0005603500 nova_compute[182934]: 2026-01-31 06:43:37.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:43:37 np0005603500 nova_compute[182934]: 2026-01-31 06:43:37.674 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:37 np0005603500 nova_compute[182934]: 2026-01-31 06:43:37.675 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:37 np0005603500 nova_compute[182934]: 2026-01-31 06:43:37.675 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:37 np0005603500 nova_compute[182934]: 2026-01-31 06:43:37.675 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:43:37 np0005603500 nova_compute[182934]: 2026-01-31 06:43:37.835 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:43:37 np0005603500 nova_compute[182934]: 2026-01-31 06:43:37.836 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5767MB free_disk=73.21186828613281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:43:37 np0005603500 nova_compute[182934]: 2026-01-31 06:43:37.836 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:37 np0005603500 nova_compute[182934]: 2026-01-31 06:43:37.836 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:38 np0005603500 nova_compute[182934]: 2026-01-31 06:43:38.759 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:38 np0005603500 nova_compute[182934]: 2026-01-31 06:43:38.991 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:43:38 np0005603500 nova_compute[182934]: 2026-01-31 06:43:38.992 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:43:39 np0005603500 nova_compute[182934]: 2026-01-31 06:43:39.012 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:43:39 np0005603500 podman[217062]: 2026-01-31 06:43:39.139236991 +0000 UTC m=+0.055076783 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1769056855, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 31 01:43:39 np0005603500 podman[217061]: 2026-01-31 06:43:39.141702608 +0000 UTC m=+0.063014411 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 01:43:39 np0005603500 nova_compute[182934]: 2026-01-31 06:43:39.534 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:43:40 np0005603500 nova_compute[182934]: 2026-01-31 06:43:40.069 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:43:40 np0005603500 nova_compute[182934]: 2026-01-31 06:43:40.070 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:40 np0005603500 nova_compute[182934]: 2026-01-31 06:43:40.944 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:41 np0005603500 nova_compute[182934]: 2026-01-31 06:43:41.071 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:43:41 np0005603500 nova_compute[182934]: 2026-01-31 06:43:41.071 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:43:41 np0005603500 nova_compute[182934]: 2026-01-31 06:43:41.072 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:43:41 np0005603500 nova_compute[182934]: 2026-01-31 06:43:41.072 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:43:41 np0005603500 nova_compute[182934]: 2026-01-31 06:43:41.072 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:43:42 np0005603500 nova_compute[182934]: 2026-01-31 06:43:42.144 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:43:42 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:42.956 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:43:43 np0005603500 nova_compute[182934]: 2026-01-31 06:43:43.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:43:43 np0005603500 nova_compute[182934]: 2026-01-31 06:43:43.760 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:45 np0005603500 nova_compute[182934]: 2026-01-31 06:43:45.947 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:47 np0005603500 podman[217108]: 2026-01-31 06:43:47.126369866 +0000 UTC m=+0.046514767 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 31 01:43:47 np0005603500 podman[217109]: 2026-01-31 06:43:47.12647898 +0000 UTC m=+0.043317389 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:43:48 np0005603500 nova_compute[182934]: 2026-01-31 06:43:48.761 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:48 np0005603500 nova_compute[182934]: 2026-01-31 06:43:48.843 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:48 np0005603500 nova_compute[182934]: 2026-01-31 06:43:48.844 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:49 np0005603500 nova_compute[182934]: 2026-01-31 06:43:49.412 182938 DEBUG nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:43:49 np0005603500 nova_compute[182934]: 2026-01-31 06:43:49.958 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:49 np0005603500 nova_compute[182934]: 2026-01-31 06:43:49.959 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:49 np0005603500 nova_compute[182934]: 2026-01-31 06:43:49.968 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:43:49 np0005603500 nova_compute[182934]: 2026-01-31 06:43:49.968 182938 INFO nova.compute.claims [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:43:50 np0005603500 nova_compute[182934]: 2026-01-31 06:43:50.950 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:51 np0005603500 nova_compute[182934]: 2026-01-31 06:43:51.046 182938 DEBUG nova.compute.provider_tree [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:43:51 np0005603500 nova_compute[182934]: 2026-01-31 06:43:51.581 182938 DEBUG nova.scheduler.client.report [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:43:52 np0005603500 nova_compute[182934]: 2026-01-31 06:43:52.094 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:52 np0005603500 nova_compute[182934]: 2026-01-31 06:43:52.095 182938 DEBUG nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:43:52 np0005603500 nova_compute[182934]: 2026-01-31 06:43:52.734 182938 DEBUG nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:43:52 np0005603500 nova_compute[182934]: 2026-01-31 06:43:52.735 182938 DEBUG nova.network.neutron [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:43:53 np0005603500 nova_compute[182934]: 2026-01-31 06:43:53.248 182938 INFO nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:43:53 np0005603500 nova_compute[182934]: 2026-01-31 06:43:53.547 182938 DEBUG nova.policy [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:43:53 np0005603500 nova_compute[182934]: 2026-01-31 06:43:53.756 182938 DEBUG nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:43:53 np0005603500 nova_compute[182934]: 2026-01-31 06:43:53.764 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:54 np0005603500 ovn_controller[95398]: 2026-01-31T06:43:54Z|00136|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.787 182938 DEBUG nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.789 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.790 182938 INFO nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Creating image(s)
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.791 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.791 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.792 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.794 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.802 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.805 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.854 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.855 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.855 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.855 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.859 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.859 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.907 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:43:54 np0005603500 nova_compute[182934]: 2026-01-31 06:43:54.909 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.013 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk 1073741824" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.014 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.014 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.064 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.065 182938 DEBUG nova.virt.disk.api [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.065 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.108 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.108 182938 DEBUG nova.virt.disk.api [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.109 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.109 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Ensure instance console log exists: /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.109 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.110 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.110 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:55 np0005603500 nova_compute[182934]: 2026-01-31 06:43:55.953 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:56.153 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:43:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:56.154 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:43:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:43:56.154 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:43:58 np0005603500 nova_compute[182934]: 2026-01-31 06:43:58.797 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:43:59 np0005603500 nova_compute[182934]: 2026-01-31 06:43:59.021 182938 DEBUG nova.network.neutron [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Successfully updated port: 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:43:59 np0005603500 nova_compute[182934]: 2026-01-31 06:43:59.228 182938 DEBUG nova.compute.manager [req-273944a1-16b2-471c-80b8-7c27debb8a2d req-235a0528-920b-4a05-aa3c-2999da9051bd 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-changed-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:43:59 np0005603500 nova_compute[182934]: 2026-01-31 06:43:59.229 182938 DEBUG nova.compute.manager [req-273944a1-16b2-471c-80b8-7c27debb8a2d req-235a0528-920b-4a05-aa3c-2999da9051bd 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Refreshing instance network info cache due to event network-changed-12d11bf4-59e7-416e-87e4-7fa01bcbfb01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:43:59 np0005603500 nova_compute[182934]: 2026-01-31 06:43:59.229 182938 DEBUG oslo_concurrency.lockutils [req-273944a1-16b2-471c-80b8-7c27debb8a2d req-235a0528-920b-4a05-aa3c-2999da9051bd 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-98648c56-4605-4545-b2f5-a13857f888d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:43:59 np0005603500 nova_compute[182934]: 2026-01-31 06:43:59.229 182938 DEBUG oslo_concurrency.lockutils [req-273944a1-16b2-471c-80b8-7c27debb8a2d req-235a0528-920b-4a05-aa3c-2999da9051bd 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-98648c56-4605-4545-b2f5-a13857f888d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:43:59 np0005603500 nova_compute[182934]: 2026-01-31 06:43:59.229 182938 DEBUG nova.network.neutron [req-273944a1-16b2-471c-80b8-7c27debb8a2d req-235a0528-920b-4a05-aa3c-2999da9051bd 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Refreshing network info cache for port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:43:59 np0005603500 nova_compute[182934]: 2026-01-31 06:43:59.531 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-98648c56-4605-4545-b2f5-a13857f888d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:44:00 np0005603500 nova_compute[182934]: 2026-01-31 06:44:00.096 182938 DEBUG nova.network.neutron [req-273944a1-16b2-471c-80b8-7c27debb8a2d req-235a0528-920b-4a05-aa3c-2999da9051bd 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:44:00 np0005603500 nova_compute[182934]: 2026-01-31 06:44:00.954 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:01 np0005603500 podman[217167]: 2026-01-31 06:44:01.132422139 +0000 UTC m=+0.051067730 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:44:01 np0005603500 podman[217168]: 2026-01-31 06:44:01.160501582 +0000 UTC m=+0.076287594 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 31 01:44:01 np0005603500 nova_compute[182934]: 2026-01-31 06:44:01.895 182938 DEBUG nova.network.neutron [req-273944a1-16b2-471c-80b8-7c27debb8a2d req-235a0528-920b-4a05-aa3c-2999da9051bd 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:44:02 np0005603500 nova_compute[182934]: 2026-01-31 06:44:02.430 182938 DEBUG oslo_concurrency.lockutils [req-273944a1-16b2-471c-80b8-7c27debb8a2d req-235a0528-920b-4a05-aa3c-2999da9051bd 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-98648c56-4605-4545-b2f5-a13857f888d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:44:02 np0005603500 nova_compute[182934]: 2026-01-31 06:44:02.431 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-98648c56-4605-4545-b2f5-a13857f888d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:44:02 np0005603500 nova_compute[182934]: 2026-01-31 06:44:02.431 182938 DEBUG nova.network.neutron [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:44:03 np0005603500 nova_compute[182934]: 2026-01-31 06:44:03.799 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:03 np0005603500 nova_compute[182934]: 2026-01-31 06:44:03.905 182938 DEBUG nova.network.neutron [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:44:05 np0005603500 nova_compute[182934]: 2026-01-31 06:44:05.957 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.057 182938 DEBUG nova.network.neutron [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Updating instance_info_cache with network_info: [{"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.568 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-98648c56-4605-4545-b2f5-a13857f888d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.569 182938 DEBUG nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Instance network_info: |[{"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.571 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Start _get_guest_xml network_info=[{"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.576 182938 WARNING nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.577 182938 DEBUG nova.virt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1263621311', uuid='98648c56-4605-4545-b2f5-a13857f888d4'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_devic
e_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841847.577704) get_instance_driver_metadata 
/usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.587 182938 DEBUG nova.virt.libvirt.host [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.587 182938 DEBUG nova.virt.libvirt.host [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.592 182938 DEBUG nova.virt.libvirt.host [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.592 182938 DEBUG nova.virt.libvirt.host [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.593 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.593 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.593 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.594 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.594 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.594 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.594 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.594 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.595 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.595 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.595 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.595 182938 DEBUG nova.virt.hardware [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.598 182938 DEBUG nova.virt.libvirt.vif [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:43:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1263621311',display_name='tempest-TestNetworkBasicOps-server-1263621311',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1263621311',id=9,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIWzoCyzy6rwnQhO3InA9hw22JADvyG6V9PGTxb2MySWYyZ71gxeusEH5sNLPgLCF/0poMpwDJl50ujCbvhZjVQUjA1bAsfFoN5PW+zrykCDIzKWzIA6twPJEoUDosfmHg==',key_name='tempest-TestNetworkBasicOps-569241740',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-74z638w8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:43:53Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=98648c56-4605-4545-b2f5-a13857f888d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.598 182938 DEBUG nova.network.os_vif_util [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.599 182938 DEBUG nova.network.os_vif_util [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:44:07 np0005603500 nova_compute[182934]: 2026-01-31 06:44:07.600 182938 DEBUG nova.objects.instance [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 98648c56-4605-4545-b2f5-a13857f888d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.116 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <uuid>98648c56-4605-4545-b2f5-a13857f888d4</uuid>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <name>instance-00000009</name>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-1263621311</nova:name>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:44:07</nova:creationTime>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:        <nova:port uuid="12d11bf4-59e7-416e-87e4-7fa01bcbfb01">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <entry name="serial">98648c56-4605-4545-b2f5-a13857f888d4</entry>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <entry name="uuid">98648c56-4605-4545-b2f5-a13857f888d4</entry>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk.config"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:ac:86:f7"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <target dev="tap12d11bf4-59"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/console.log" append="off"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:44:08 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:44:08 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:44:08 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:44:08 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.117 182938 DEBUG nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Preparing to wait for external event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.118 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.118 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.118 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.119 182938 DEBUG nova.virt.libvirt.vif [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:43:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1263621311',display_name='tempest-TestNetworkBasicOps-server-1263621311',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1263621311',id=9,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIWzoCyzy6rwnQhO3InA9hw22JADvyG6V9PGTxb2MySWYyZ71gxeusEH5sNLPgLCF/0poMpwDJl50ujCbvhZjVQUjA1bAsfFoN5PW+zrykCDIzKWzIA6twPJEoUDosfmHg==',key_name='tempest-TestNetworkBasicOps-569241740',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-74z638w8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:43:53Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=98648c56-4605-4545-b2f5-a13857f888d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.119 182938 DEBUG nova.network.os_vif_util [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.120 182938 DEBUG nova.network.os_vif_util [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.120 182938 DEBUG os_vif [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.121 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.121 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.121 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.122 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.122 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9aa4cca6-1bf6-5c63-9332-34cc3ac60574', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.124 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.125 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.128 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.129 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12d11bf4-59, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.129 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap12d11bf4-59, col_values=(('qos', UUID('eef9d9df-8558-4435-b4bf-7b8424037353')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.130 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap12d11bf4-59, col_values=(('external_ids', {'iface-id': '12d11bf4-59e7-416e-87e4-7fa01bcbfb01', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:86:f7', 'vm-uuid': '98648c56-4605-4545-b2f5-a13857f888d4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.131 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:08 np0005603500 NetworkManager[55506]: <info>  [1769841848.1321] manager: (tap12d11bf4-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.133 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.137 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.139 182938 INFO os_vif [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59')
Jan 31 01:44:08 np0005603500 nova_compute[182934]: 2026-01-31 06:44:08.801 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:09 np0005603500 nova_compute[182934]: 2026-01-31 06:44:09.738 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:44:09 np0005603500 nova_compute[182934]: 2026-01-31 06:44:09.738 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:44:09 np0005603500 nova_compute[182934]: 2026-01-31 06:44:09.738 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:ac:86:f7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:44:09 np0005603500 nova_compute[182934]: 2026-01-31 06:44:09.739 182938 INFO nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Using config drive
Jan 31 01:44:10 np0005603500 podman[217210]: 2026-01-31 06:44:10.125348892 +0000 UTC m=+0.046462266 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, name=ubi9/ubi-minimal, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.7)
Jan 31 01:44:10 np0005603500 podman[217209]: 2026-01-31 06:44:10.152850837 +0000 UTC m=+0.071655689 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 31 01:44:11 np0005603500 nova_compute[182934]: 2026-01-31 06:44:11.912 182938 INFO nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Creating config drive at /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk.config
Jan 31 01:44:11 np0005603500 nova_compute[182934]: 2026-01-31 06:44:11.917 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpbblui20l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:44:12 np0005603500 nova_compute[182934]: 2026-01-31 06:44:12.033 182938 DEBUG oslo_concurrency.processutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpbblui20l" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:44:12 np0005603500 kernel: tap12d11bf4-59: entered promiscuous mode
Jan 31 01:44:12 np0005603500 NetworkManager[55506]: <info>  [1769841852.0711] manager: (tap12d11bf4-59): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Jan 31 01:44:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:12Z|00137|binding|INFO|Claiming lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for this chassis.
Jan 31 01:44:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:12Z|00138|binding|INFO|12d11bf4-59e7-416e-87e4-7fa01bcbfb01: Claiming fa:16:3e:ac:86:f7 10.100.0.14
Jan 31 01:44:12 np0005603500 nova_compute[182934]: 2026-01-31 06:44:12.072 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:12Z|00139|binding|INFO|Setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 ovn-installed in OVS
Jan 31 01:44:12 np0005603500 nova_compute[182934]: 2026-01-31 06:44:12.078 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:12 np0005603500 nova_compute[182934]: 2026-01-31 06:44:12.080 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:12 np0005603500 systemd-machined[154375]: New machine qemu-9-instance-00000009.
Jan 31 01:44:12 np0005603500 systemd-udevd[217272]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:44:12 np0005603500 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Jan 31 01:44:12 np0005603500 NetworkManager[55506]: <info>  [1769841852.1103] device (tap12d11bf4-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:44:12 np0005603500 NetworkManager[55506]: <info>  [1769841852.1111] device (tap12d11bf4-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.196 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:86:f7 10.100.0.14'], port_security=['fa:16:3e:ac:86:f7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '98648c56-4605-4545-b2f5-a13857f888d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '10', 'neutron:security_group_ids': '859450e8-8706-4a8f-9018-d6c18f1c21cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=074897e1-acb2-45fc-95e8-d7347eaae0a8, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=12d11bf4-59e7-416e-87e4-7fa01bcbfb01) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:44:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:12Z|00140|binding|INFO|Setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 up in Southbound
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.200 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 in datapath f945b6f3-ac5f-4949-a595-265a9c245851 bound to our chassis
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.201 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f945b6f3-ac5f-4949-a595-265a9c245851
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.210 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[2292ba96-1abc-49b3-aca6-8a8262121c5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.210 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf945b6f3-a1 in ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.212 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf945b6f3-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.212 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f932d0cf-ec32-4d54-aa03-2e9de43dd18d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.214 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5f0720-7623-4527-bee2-44b3c3e250db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.220 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[2b0f395c-3167-4391-babd-720e9bee750b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.229 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[036c28d8-aa25-4320-ad3b-e906b2793ee5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.252 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[d73a32e0-ec2c-405c-a765-82f1d1685d3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 NetworkManager[55506]: <info>  [1769841852.2582] manager: (tapf945b6f3-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.258 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d46cafef-f757-40ae-969f-976d48855085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 systemd-udevd[217274]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.286 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0c272e-8777-40b6-b89b-0b8ae615506b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.289 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[50f1c3ff-d3ef-43e3-bfc6-1439fcbe6f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 NetworkManager[55506]: <info>  [1769841852.3070] device (tapf945b6f3-a0): carrier: link connected
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.311 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[630ee9bf-8cfd-4c8a-9934-654c6f219ead]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.322 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f39378a2-1930-4810-8c69-908e55e9fefc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf945b6f3-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:fd:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416360, 'reachable_time': 40713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217305, 'error': None, 'target': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.334 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[079f9520-9a46-43e0-8d7d-e4f951747354]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:fd9c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416360, 'tstamp': 416360}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217306, 'error': None, 'target': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.346 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[47c12a00-64fa-4717-93da-a5a64619a58e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf945b6f3-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:fd:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416360, 'reachable_time': 40713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217307, 'error': None, 'target': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.370 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d76411f1-6e85-4992-89e5-f9b32d4e77fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.412 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[879fa989-facb-4561-b15f-c60cf7f74b8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.413 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf945b6f3-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.413 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.413 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf945b6f3-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:12 np0005603500 kernel: tapf945b6f3-a0: entered promiscuous mode
Jan 31 01:44:12 np0005603500 NetworkManager[55506]: <info>  [1769841852.4156] manager: (tapf945b6f3-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.417 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf945b6f3-a0, col_values=(('external_ids', {'iface-id': '1539f259-b2fb-49d9-b369-ffe8f622864d'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:12 np0005603500 nova_compute[182934]: 2026-01-31 06:44:12.418 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:12Z|00141|binding|INFO|Releasing lport 1539f259-b2fb-49d9-b369-ffe8f622864d from this chassis (sb_readonly=0)
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.424 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a6361716-1c65-4e27-8c48-0995bed96f23]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.425 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.425 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.425 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f945b6f3-ac5f-4949-a595-265a9c245851 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.425 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.425 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[53e42e53-b4a6-4cbe-999a-0c1519f002cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.426 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.426 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[950fffd4-676c-4114-91f2-b2afb313ece6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.426 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-f945b6f3-ac5f-4949-a595-265a9c245851
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID f945b6f3-ac5f-4949-a595-265a9c245851
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:44:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:12.427 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'env', 'PROCESS_TAG=haproxy-f945b6f3-ac5f-4949-a595-265a9c245851', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f945b6f3-ac5f-4949-a595-265a9c245851.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:44:12 np0005603500 podman[217338]: 2026-01-31 06:44:12.753377129 +0000 UTC m=+0.047025173 container create 4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:44:12 np0005603500 systemd[1]: Started libpod-conmon-4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d.scope.
Jan 31 01:44:12 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:44:12 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e65957f84186a4ec0daceb166b4c789cb68f97dd48e92414076aae799b024a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:44:12 np0005603500 podman[217338]: 2026-01-31 06:44:12.818926168 +0000 UTC m=+0.112574232 container init 4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:44:12 np0005603500 podman[217338]: 2026-01-31 06:44:12.726277046 +0000 UTC m=+0.019925100 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:44:12 np0005603500 podman[217338]: 2026-01-31 06:44:12.825231523 +0000 UTC m=+0.118879567 container start 4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 01:44:12 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[217353]: [NOTICE]   (217362) : New worker (217365) forked
Jan 31 01:44:12 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[217353]: [NOTICE]   (217362) : Loading success.
Jan 31 01:44:13 np0005603500 nova_compute[182934]: 2026-01-31 06:44:13.132 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:13 np0005603500 nova_compute[182934]: 2026-01-31 06:44:13.803 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:14 np0005603500 nova_compute[182934]: 2026-01-31 06:44:14.677 182938 DEBUG nova.compute.manager [req-eab44ec9-62ce-4dbe-b162-9f0efefd2b81 req-70382c2a-808b-4a25-b77a-158104312bb7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:44:14 np0005603500 nova_compute[182934]: 2026-01-31 06:44:14.678 182938 DEBUG oslo_concurrency.lockutils [req-eab44ec9-62ce-4dbe-b162-9f0efefd2b81 req-70382c2a-808b-4a25-b77a-158104312bb7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:14 np0005603500 nova_compute[182934]: 2026-01-31 06:44:14.678 182938 DEBUG oslo_concurrency.lockutils [req-eab44ec9-62ce-4dbe-b162-9f0efefd2b81 req-70382c2a-808b-4a25-b77a-158104312bb7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:14 np0005603500 nova_compute[182934]: 2026-01-31 06:44:14.678 182938 DEBUG oslo_concurrency.lockutils [req-eab44ec9-62ce-4dbe-b162-9f0efefd2b81 req-70382c2a-808b-4a25-b77a-158104312bb7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:14 np0005603500 nova_compute[182934]: 2026-01-31 06:44:14.678 182938 DEBUG nova.compute.manager [req-eab44ec9-62ce-4dbe-b162-9f0efefd2b81 req-70382c2a-808b-4a25-b77a-158104312bb7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Processing event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:44:14 np0005603500 nova_compute[182934]: 2026-01-31 06:44:14.679 182938 DEBUG nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:44:14 np0005603500 nova_compute[182934]: 2026-01-31 06:44:14.684 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:44:14 np0005603500 nova_compute[182934]: 2026-01-31 06:44:14.688 182938 INFO nova.virt.libvirt.driver [-] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Instance spawned successfully.
Jan 31 01:44:14 np0005603500 nova_compute[182934]: 2026-01-31 06:44:14.688 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:44:15 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:15.056 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:44:15 np0005603500 nova_compute[182934]: 2026-01-31 06:44:15.057 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:15 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:15.058 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:44:15 np0005603500 nova_compute[182934]: 2026-01-31 06:44:15.348 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:44:15 np0005603500 nova_compute[182934]: 2026-01-31 06:44:15.348 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:44:15 np0005603500 nova_compute[182934]: 2026-01-31 06:44:15.349 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:44:15 np0005603500 nova_compute[182934]: 2026-01-31 06:44:15.349 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:44:15 np0005603500 nova_compute[182934]: 2026-01-31 06:44:15.349 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:44:15 np0005603500 nova_compute[182934]: 2026-01-31 06:44:15.350 182938 DEBUG nova.virt.libvirt.driver [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:44:15 np0005603500 nova_compute[182934]: 2026-01-31 06:44:15.918 182938 INFO nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Took 21.13 seconds to spawn the instance on the hypervisor.
Jan 31 01:44:15 np0005603500 nova_compute[182934]: 2026-01-31 06:44:15.919 182938 DEBUG nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:44:16 np0005603500 nova_compute[182934]: 2026-01-31 06:44:16.995 182938 DEBUG nova.compute.manager [req-da61b020-7541-403b-957c-ca9a55957fe7 req-b7506a6d-1c8a-49dc-bed8-5203c8f5009c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:44:16 np0005603500 nova_compute[182934]: 2026-01-31 06:44:16.996 182938 DEBUG oslo_concurrency.lockutils [req-da61b020-7541-403b-957c-ca9a55957fe7 req-b7506a6d-1c8a-49dc-bed8-5203c8f5009c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:16 np0005603500 nova_compute[182934]: 2026-01-31 06:44:16.996 182938 DEBUG oslo_concurrency.lockutils [req-da61b020-7541-403b-957c-ca9a55957fe7 req-b7506a6d-1c8a-49dc-bed8-5203c8f5009c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:16 np0005603500 nova_compute[182934]: 2026-01-31 06:44:16.996 182938 DEBUG oslo_concurrency.lockutils [req-da61b020-7541-403b-957c-ca9a55957fe7 req-b7506a6d-1c8a-49dc-bed8-5203c8f5009c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:16 np0005603500 nova_compute[182934]: 2026-01-31 06:44:16.996 182938 DEBUG nova.compute.manager [req-da61b020-7541-403b-957c-ca9a55957fe7 req-b7506a6d-1c8a-49dc-bed8-5203c8f5009c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] No waiting events found dispatching network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:44:16 np0005603500 nova_compute[182934]: 2026-01-31 06:44:16.996 182938 WARNING nova.compute.manager [req-da61b020-7541-403b-957c-ca9a55957fe7 req-b7506a6d-1c8a-49dc-bed8-5203c8f5009c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received unexpected event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with vm_state active and task_state None.
Jan 31 01:44:17 np0005603500 nova_compute[182934]: 2026-01-31 06:44:17.000 182938 INFO nova.compute.manager [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Took 27.07 seconds to build instance.
Jan 31 01:44:17 np0005603500 nova_compute[182934]: 2026-01-31 06:44:17.561 182938 DEBUG oslo_concurrency.lockutils [None req-c0547cc5-7223-46c1-af75-fafdabe28c27 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 28.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:17.988 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/98648c56-4605-4545-b2f5-a13857f888d4 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}9de33c3c4c813c7413c734743528a34030291a616c281269e5092e293b0fad44" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Jan 31 01:44:18 np0005603500 podman[217376]: 2026-01-31 06:44:18.130486006 +0000 UTC m=+0.046953511 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 01:44:18 np0005603500 podman[217377]: 2026-01-31 06:44:18.130750044 +0000 UTC m=+0.042828112 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:44:18 np0005603500 nova_compute[182934]: 2026-01-31 06:44:18.136 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:18 np0005603500 nova_compute[182934]: 2026-01-31 06:44:18.805 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.183 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1949 Content-Type: application/json Date: Sat, 31 Jan 2026 06:44:17 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-908ed960-bbff-4bd4-bdb5-4f7f8fa922cf x-openstack-request-id: req-908ed960-bbff-4bd4-bdb5-4f7f8fa922cf _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.183 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "98648c56-4605-4545-b2f5-a13857f888d4", "name": "tempest-TestNetworkBasicOps-server-1263621311", "status": "ACTIVE", "tenant_id": "829310cd8381494e96216dba067ff8d3", "user_id": "dddc34b0385a49a5bd9bf081ed29e9fd", "metadata": {}, "hostId": "0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e", "image": {"id": "9f613975-b701-42a0-9b35-7d5c4a2cb7f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/9f613975-b701-42a0-9b35-7d5c4a2cb7f2"}]}, "flavor": {"id": "9956992e-a3ca-497f-9747-3ae270e07def", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9956992e-a3ca-497f-9747-3ae270e07def"}]}, "created": "2026-01-31T06:43:46Z", "updated": "2026-01-31T06:44:16Z", "addresses": {"tempest-network-smoke--786680245": [{"version": 4, "addr": "10.100.0.14", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:ac:86:f7"}, {"version": 4, "addr": "192.168.122.230", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:ac:86:f7"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/98648c56-4605-4545-b2f5-a13857f888d4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/98648c56-4605-4545-b2f5-a13857f888d4"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-569241740", "OS-SRV-USG:launched_at": "2026-01-31T06:44:15.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "default"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000009", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": 
"active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.183 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/98648c56-4605-4545-b2f5-a13857f888d4 used request id req-908ed960-bbff-4bd4-bdb5-4f7f8fa922cf request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.184 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '98648c56-4605-4545-b2f5-a13857f888d4', 'name': 'tempest-TestNetworkBasicOps-server-1263621311', 'flavor': {'id': '9956992e-a3ca-497f-9747-3ae270e07def', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000009', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '829310cd8381494e96216dba067ff8d3', 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'hostId': '0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.184 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.184 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.184 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.184 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.185 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.185 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.185 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.185 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.185 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.185 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.186 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.186 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1263621311>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1263621311>]
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.186 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.186 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.186 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.186 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2026-01-31T06:44:19.184903) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.186 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.186 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.187 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2026-01-31T06:44:19.185885) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.187 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2026-01-31T06:44:19.186958) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.208 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.208 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.208 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.208 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.209 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.209 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.209 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.209 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.209 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.read.latency volume: 462079992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.209 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.read.latency volume: 2858769 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.210 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.209 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2026-01-31T06:44:19.209258) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.210 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.210 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.210 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.210 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.210 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.211 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2026-01-31T06:44:19.210700) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.213 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 98648c56-4605-4545-b2f5-a13857f888d4 / tap12d11bf4-59 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.213 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.213 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.213 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.213 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.214 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.214 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.214 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.214 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.214 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.214 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.214 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.215 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.215 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.215 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.215 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.215 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.215 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.215 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.216 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.216 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.216 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.216 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.216 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.216 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.216 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.217 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.217 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.217 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.217 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.217 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.217 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.217 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.218 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.218 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.218 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.218 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.218 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.218 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2026-01-31T06:44:19.214201) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.219 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2026-01-31T06:44:19.215250) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.219 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2026-01-31T06:44:19.216146) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.219 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2026-01-31T06:44:19.217352) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.219 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2026-01-31T06:44:19.218298) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.231 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/cpu volume: 4310000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.231 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.232 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.232 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.232 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.232 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.232 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.232 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.232 16 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 98648c56-4605-4545-b2f5-a13857f888d4: ceilometer.compute.pollsters.NoVolumeException
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.233 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.233 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.233 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.233 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.233 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.233 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2026-01-31T06:44:19.232415) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.233 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.233 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.233 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2026-01-31T06:44:19.233468) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.234 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.234 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.234 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.234 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.234 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.234 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.234 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2026-01-31T06:44:19.234517) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.234 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.234 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1263621311>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1263621311>]
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.235 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.235 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.235 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.235 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.235 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.235 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.235 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.236 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.236 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.236 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2026-01-31T06:44:19.235429) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.236 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.236 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.236 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.236 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2026-01-31T06:44:19.236502) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.236 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.237 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.237 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.237 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.237 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.237 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.237 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.237 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2026-01-31T06:44:19.237362) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.237 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.238 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.238 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.238 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.238 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.238 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.238 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.238 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2026-01-31T06:44:19.238368) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.238 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.239 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.239 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.239 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.239 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.239 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.239 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.239 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.240 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.240 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.240 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.240 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2026-01-31T06:44:19.239417) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.240 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.240 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.240 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2026-01-31T06:44:19.240445) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.240 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.240 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.241 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.241 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.241 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.241 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.241 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.241 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.241 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.242 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.242 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.242 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.242 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.242 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.242 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2026-01-31T06:44:19.241651) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.242 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.242 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.242 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2026-01-31T06:44:19.242687) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.243 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.243 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.243 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.243 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.243 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.243 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.243 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2026-01-31T06:44:19.243689) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.252 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.252 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.252 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.253 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.253 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.253 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.253 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.253 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.253 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.253 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.254 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.254 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.254 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2026-01-31T06:44:19.253471) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.254 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.255 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.255 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.255 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.255 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2026-01-31T06:44:19.255224) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.255 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.255 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.256 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.256 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.256 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.256 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.256 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.256 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.256 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.256 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2026-01-31T06:44:19.256527) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.256 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.257 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.257 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.257 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.257 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.257 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.257 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.257 16 DEBUG ceilometer.compute.pollsters [-] 98648c56-4605-4545-b2f5-a13857f888d4/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.257 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2026-01-31T06:44:19.257802) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:44:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:44:19.258 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:44:21 np0005603500 nova_compute[182934]: 2026-01-31 06:44:21.984 182938 DEBUG oslo_concurrency.lockutils [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:21 np0005603500 nova_compute[182934]: 2026-01-31 06:44:21.985 182938 DEBUG oslo_concurrency.lockutils [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:21 np0005603500 nova_compute[182934]: 2026-01-31 06:44:21.985 182938 DEBUG oslo_concurrency.lockutils [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:21 np0005603500 nova_compute[182934]: 2026-01-31 06:44:21.985 182938 DEBUG oslo_concurrency.lockutils [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:21 np0005603500 nova_compute[182934]: 2026-01-31 06:44:21.986 182938 DEBUG oslo_concurrency.lockutils [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:21 np0005603500 nova_compute[182934]: 2026-01-31 06:44:21.987 182938 INFO nova.compute.manager [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Terminating instance
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.059 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.515 182938 DEBUG nova.compute.manager [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:44:22 np0005603500 kernel: tap12d11bf4-59 (unregistering): left promiscuous mode
Jan 31 01:44:22 np0005603500 NetworkManager[55506]: <info>  [1769841862.5336] device (tap12d11bf4-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00142|binding|INFO|Releasing lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 from this chassis (sb_readonly=0)
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00143|binding|INFO|Setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 down in Southbound
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00144|binding|INFO|Removing iface tap12d11bf4-59 ovn-installed in OVS
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.540 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.541 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.549 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 31 01:44:22 np0005603500 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 8.803s CPU time.
Jan 31 01:44:22 np0005603500 systemd-machined[154375]: Machine qemu-9-instance-00000009 terminated.
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.614 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:86:f7 10.100.0.14'], port_security=['fa:16:3e:ac:86:f7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '98648c56-4605-4545-b2f5-a13857f888d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '11', 'neutron:security_group_ids': '859450e8-8706-4a8f-9018-d6c18f1c21cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.230', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=074897e1-acb2-45fc-95e8-d7347eaae0a8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=12d11bf4-59e7-416e-87e4-7fa01bcbfb01) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.616 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 in datapath f945b6f3-ac5f-4949-a595-265a9c245851 unbound from our chassis
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.617 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f945b6f3-ac5f-4949-a595-265a9c245851, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.618 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8db3a02d-6a1d-4afe-85c3-45780d7d6635]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.619 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 namespace which is not needed anymore
Jan 31 01:44:22 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[217353]: [NOTICE]   (217362) : haproxy version is 2.8.14-c23fe91
Jan 31 01:44:22 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[217353]: [NOTICE]   (217362) : path to executable is /usr/sbin/haproxy
Jan 31 01:44:22 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[217353]: [WARNING]  (217362) : Exiting Master process...
Jan 31 01:44:22 np0005603500 podman[217443]: 2026-01-31 06:44:22.715867793 +0000 UTC m=+0.027032712 container kill 4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:44:22 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[217353]: [ALERT]    (217362) : Current worker (217365) exited with code 143 (Terminated)
Jan 31 01:44:22 np0005603500 neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851[217353]: [WARNING]  (217362) : All workers exited. Exiting... (0)
Jan 31 01:44:22 np0005603500 systemd[1]: libpod-4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d.scope: Deactivated successfully.
Jan 31 01:44:22 np0005603500 kernel: tap12d11bf4-59: entered promiscuous mode
Jan 31 01:44:22 np0005603500 NetworkManager[55506]: <info>  [1769841862.7294] manager: (tap12d11bf4-59): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00145|binding|INFO|Claiming lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for this chassis.
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00146|binding|INFO|12d11bf4-59e7-416e-87e4-7fa01bcbfb01: Claiming fa:16:3e:ac:86:f7 10.100.0.14
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.731 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 systemd-udevd[217423]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:44:22 np0005603500 kernel: tap12d11bf4-59 (unregistering): left promiscuous mode
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00147|binding|INFO|Setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 ovn-installed in OVS
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.738 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.743 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00148|binding|INFO|Setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 up in Southbound
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00149|binding|INFO|Releasing lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 from this chassis (sb_readonly=1)
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00150|binding|INFO|Removing iface tap12d11bf4-59 ovn-installed in OVS
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00151|if_status|INFO|Not setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 down as sb is readonly
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.745 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.746 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:86:f7 10.100.0.14'], port_security=['fa:16:3e:ac:86:f7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '98648c56-4605-4545-b2f5-a13857f888d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '11', 'neutron:security_group_ids': '859450e8-8706-4a8f-9018-d6c18f1c21cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.230', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=074897e1-acb2-45fc-95e8-d7347eaae0a8, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=12d11bf4-59e7-416e-87e4-7fa01bcbfb01) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.749 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00152|binding|INFO|Releasing lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 from this chassis (sb_readonly=0)
Jan 31 01:44:22 np0005603500 ovn_controller[95398]: 2026-01-31T06:44:22Z|00153|binding|INFO|Setting lport 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 down in Southbound
Jan 31 01:44:22 np0005603500 podman[217458]: 2026-01-31 06:44:22.766473777 +0000 UTC m=+0.038476168 container died 4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.780 182938 INFO nova.virt.libvirt.driver [-] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Instance destroyed successfully.
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.780 182938 DEBUG nova.objects.instance [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 98648c56-4605-4545-b2f5-a13857f888d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.805 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:86:f7 10.100.0.14'], port_security=['fa:16:3e:ac:86:f7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '98648c56-4605-4545-b2f5-a13857f888d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f945b6f3-ac5f-4949-a595-265a9c245851', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-603432797', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '11', 'neutron:security_group_ids': '859450e8-8706-4a8f-9018-d6c18f1c21cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.230', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=074897e1-acb2-45fc-95e8-d7347eaae0a8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=12d11bf4-59e7-416e-87e4-7fa01bcbfb01) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:44:22 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d-userdata-shm.mount: Deactivated successfully.
Jan 31 01:44:22 np0005603500 systemd[1]: var-lib-containers-storage-overlay-8e65957f84186a4ec0daceb166b4c789cb68f97dd48e92414076aae799b024a3-merged.mount: Deactivated successfully.
Jan 31 01:44:22 np0005603500 podman[217458]: 2026-01-31 06:44:22.921052364 +0000 UTC m=+0.193054755 container cleanup 4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 01:44:22 np0005603500 systemd[1]: libpod-conmon-4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d.scope: Deactivated successfully.
Jan 31 01:44:22 np0005603500 podman[217472]: 2026-01-31 06:44:22.963629978 +0000 UTC m=+0.204920664 container remove 4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.967 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c63a6bbe-b74a-4e55-8244-83b105b80a23]: (4, ("Sat Jan 31 06:44:22 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 (4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d)\n4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d\nSat Jan 31 06:44:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 (4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d)\n4003cc61a3b5c7faf81f2ad13107c9f2c065215d7a299df05a9614b30eefd24d\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.970 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[78466a4f-b5ae-4f8a-845b-c1cedc72409f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.970 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f945b6f3-ac5f-4949-a595-265a9c245851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.971 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6a1c1c6d-79ba-4cc4-965f-94c837722a94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.972 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf945b6f3-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.975 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 kernel: tapf945b6f3-a0: left promiscuous mode
Jan 31 01:44:22 np0005603500 nova_compute[182934]: 2026-01-31 06:44:22.987 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:22 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:22.990 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[92cf53cf-6752-4847-b501-3ff4f731efd8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.004 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e97a0468-2e10-421c-b95d-9ecff982088f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.006 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6abda970-2595-469a-9d29-0879cd75757e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.023 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[bb30cb63-6fd6-445b-9cea-106d8d2fd3e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416354, 'reachable_time': 23815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217505, 'error': None, 'target': 'ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.029 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f945b6f3-ac5f-4949-a595-265a9c245851 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.029 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[e04d9bc5-e0fc-4fec-b9fb-b04eb7e1cc8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:23 np0005603500 systemd[1]: run-netns-ovnmeta\x2df945b6f3\x2dac5f\x2d4949\x2da595\x2d265a9c245851.mount: Deactivated successfully.
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.030 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 in datapath f945b6f3-ac5f-4949-a595-265a9c245851 unbound from our chassis
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.031 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f945b6f3-ac5f-4949-a595-265a9c245851, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.032 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[5d3b637e-997a-49cd-a32d-8da757b50c13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.033 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 12d11bf4-59e7-416e-87e4-7fa01bcbfb01 in datapath f945b6f3-ac5f-4949-a595-265a9c245851 unbound from our chassis
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.033 182938 DEBUG nova.compute.manager [req-fe55d3c7-c279-4796-843d-5b40a664e2c6 req-32249dac-28a3-4bbd-b05e-e5027a639845 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-unplugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.033 182938 DEBUG oslo_concurrency.lockutils [req-fe55d3c7-c279-4796-843d-5b40a664e2c6 req-32249dac-28a3-4bbd-b05e-e5027a639845 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.034 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f945b6f3-ac5f-4949-a595-265a9c245851, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.034 182938 DEBUG oslo_concurrency.lockutils [req-fe55d3c7-c279-4796-843d-5b40a664e2c6 req-32249dac-28a3-4bbd-b05e-e5027a639845 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.034 182938 DEBUG oslo_concurrency.lockutils [req-fe55d3c7-c279-4796-843d-5b40a664e2c6 req-32249dac-28a3-4bbd-b05e-e5027a639845 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.034 182938 DEBUG nova.compute.manager [req-fe55d3c7-c279-4796-843d-5b40a664e2c6 req-32249dac-28a3-4bbd-b05e-e5027a639845 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] No waiting events found dispatching network-vif-unplugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.034 182938 DEBUG nova.compute.manager [req-fe55d3c7-c279-4796-843d-5b40a664e2c6 req-32249dac-28a3-4bbd-b05e-e5027a639845 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-unplugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:44:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:23.034 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c0326dac-0785-481f-8d89-a36a8b210fa8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.137 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.295 182938 DEBUG nova.virt.libvirt.vif [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:43:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1263621311',display_name='tempest-TestNetworkBasicOps-server-1263621311',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1263621311',id=9,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIWzoCyzy6rwnQhO3InA9hw22JADvyG6V9PGTxb2MySWYyZ71gxeusEH5sNLPgLCF/0poMpwDJl50ujCbvhZjVQUjA1bAsfFoN5PW+zrykCDIzKWzIA6twPJEoUDosfmHg==',key_name='tempest-TestNetworkBasicOps-569241740',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:44:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-74z638w8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:44:16Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=98648c56-4605-4545-b2f5-a13857f888d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.295 182938 DEBUG nova.network.os_vif_util [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "address": "fa:16:3e:ac:86:f7", "network": {"id": "f945b6f3-ac5f-4949-a595-265a9c245851", "bridge": "br-int", "label": "tempest-network-smoke--786680245", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d11bf4-59", "ovs_interfaceid": "12d11bf4-59e7-416e-87e4-7fa01bcbfb01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.296 182938 DEBUG nova.network.os_vif_util [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.297 182938 DEBUG os_vif [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.299 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.299 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12d11bf4-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.300 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.301 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.302 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.302 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=eef9d9df-8558-4435-b4bf-7b8424037353) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.303 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.303 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.305 182938 INFO os_vif [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:86:f7,bridge_name='br-int',has_traffic_filtering=True,id=12d11bf4-59e7-416e-87e4-7fa01bcbfb01,network=Network(f945b6f3-ac5f-4949-a595-265a9c245851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap12d11bf4-59')
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.306 182938 INFO nova.virt.libvirt.driver [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Deleting instance files /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4_del
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.306 182938 INFO nova.virt.libvirt.driver [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Deletion of /var/lib/nova/instances/98648c56-4605-4545-b2f5-a13857f888d4_del complete
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.806 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.858 182938 INFO nova.compute.manager [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Took 1.34 seconds to destroy the instance on the hypervisor.
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.858 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.859 182938 DEBUG nova.compute.manager [-] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:44:23 np0005603500 nova_compute[182934]: 2026-01-31 06:44:23.859 182938 DEBUG nova.network.neutron [-] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.314 182938 DEBUG nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.315 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.315 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.315 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.315 182938 DEBUG nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] No waiting events found dispatching network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.316 182938 WARNING nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received unexpected event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with vm_state active and task_state deleting.
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.316 182938 DEBUG nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.316 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.316 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.317 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.317 182938 DEBUG nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] No waiting events found dispatching network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.317 182938 WARNING nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received unexpected event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with vm_state active and task_state deleting.
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.317 182938 DEBUG nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.318 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.318 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.318 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.318 182938 DEBUG nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] No waiting events found dispatching network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.318 182938 WARNING nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received unexpected event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with vm_state active and task_state deleting.
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.319 182938 DEBUG nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-unplugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.319 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.319 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.319 182938 DEBUG oslo_concurrency.lockutils [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.320 182938 DEBUG nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] No waiting events found dispatching network-vif-unplugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:44:25 np0005603500 nova_compute[182934]: 2026-01-31 06:44:25.320 182938 DEBUG nova.compute.manager [req-4c68174c-5638-4bdf-880b-cd786cc326c3 req-8b19f604-e418-4929-bffb-489a686221d0 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-unplugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:44:27 np0005603500 nova_compute[182934]: 2026-01-31 06:44:27.499 182938 DEBUG nova.network.neutron [-] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:44:27 np0005603500 nova_compute[182934]: 2026-01-31 06:44:27.624 182938 DEBUG nova.compute.manager [req-2ceaaa62-437b-480c-9eca-c85436e64230 req-6393ee24-9808-4bec-852c-ba4a1b63bf8c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:44:27 np0005603500 nova_compute[182934]: 2026-01-31 06:44:27.624 182938 DEBUG oslo_concurrency.lockutils [req-2ceaaa62-437b-480c-9eca-c85436e64230 req-6393ee24-9808-4bec-852c-ba4a1b63bf8c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "98648c56-4605-4545-b2f5-a13857f888d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:27 np0005603500 nova_compute[182934]: 2026-01-31 06:44:27.625 182938 DEBUG oslo_concurrency.lockutils [req-2ceaaa62-437b-480c-9eca-c85436e64230 req-6393ee24-9808-4bec-852c-ba4a1b63bf8c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:27 np0005603500 nova_compute[182934]: 2026-01-31 06:44:27.625 182938 DEBUG oslo_concurrency.lockutils [req-2ceaaa62-437b-480c-9eca-c85436e64230 req-6393ee24-9808-4bec-852c-ba4a1b63bf8c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:27 np0005603500 nova_compute[182934]: 2026-01-31 06:44:27.625 182938 DEBUG nova.compute.manager [req-2ceaaa62-437b-480c-9eca-c85436e64230 req-6393ee24-9808-4bec-852c-ba4a1b63bf8c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] No waiting events found dispatching network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:44:27 np0005603500 nova_compute[182934]: 2026-01-31 06:44:27.625 182938 WARNING nova.compute.manager [req-2ceaaa62-437b-480c-9eca-c85436e64230 req-6393ee24-9808-4bec-852c-ba4a1b63bf8c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Received unexpected event network-vif-plugged-12d11bf4-59e7-416e-87e4-7fa01bcbfb01 for instance with vm_state active and task_state deleting.
Jan 31 01:44:28 np0005603500 nova_compute[182934]: 2026-01-31 06:44:28.033 182938 INFO nova.compute.manager [-] [instance: 98648c56-4605-4545-b2f5-a13857f888d4] Took 4.17 seconds to deallocate network for instance.
Jan 31 01:44:28 np0005603500 nova_compute[182934]: 2026-01-31 06:44:28.303 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:28 np0005603500 nova_compute[182934]: 2026-01-31 06:44:28.567 182938 DEBUG oslo_concurrency.lockutils [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:28 np0005603500 nova_compute[182934]: 2026-01-31 06:44:28.567 182938 DEBUG oslo_concurrency.lockutils [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:28 np0005603500 nova_compute[182934]: 2026-01-31 06:44:28.622 182938 DEBUG nova.compute.provider_tree [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:44:28 np0005603500 nova_compute[182934]: 2026-01-31 06:44:28.807 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:29 np0005603500 nova_compute[182934]: 2026-01-31 06:44:29.133 182938 DEBUG nova.scheduler.client.report [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:44:29 np0005603500 nova_compute[182934]: 2026-01-31 06:44:29.651 182938 DEBUG oslo_concurrency.lockutils [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:29 np0005603500 nova_compute[182934]: 2026-01-31 06:44:29.706 182938 INFO nova.scheduler.client.report [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 98648c56-4605-4545-b2f5-a13857f888d4
Jan 31 01:44:30 np0005603500 nova_compute[182934]: 2026-01-31 06:44:30.757 182938 DEBUG oslo_concurrency.lockutils [None req-6d6b7246-e316-477e-8a1e-2a7d72962f29 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "98648c56-4605-4545-b2f5-a13857f888d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:32 np0005603500 podman[217507]: 2026-01-31 06:44:32.140350857 +0000 UTC m=+0.052218405 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 31 01:44:32 np0005603500 podman[217506]: 2026-01-31 06:44:32.140407069 +0000 UTC m=+0.054585169 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:44:33 np0005603500 nova_compute[182934]: 2026-01-31 06:44:33.306 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:33 np0005603500 nova_compute[182934]: 2026-01-31 06:44:33.808 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:37 np0005603500 nova_compute[182934]: 2026-01-31 06:44:37.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:44:37 np0005603500 nova_compute[182934]: 2026-01-31 06:44:37.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.174 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.175 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.175 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.175 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.307 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.327 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.328 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5757MB free_disk=73.21175384521484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.328 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.328 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:38 np0005603500 nova_compute[182934]: 2026-01-31 06:44:38.810 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:39 np0005603500 nova_compute[182934]: 2026-01-31 06:44:39.502 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:44:39 np0005603500 nova_compute[182934]: 2026-01-31 06:44:39.503 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:44:39 np0005603500 nova_compute[182934]: 2026-01-31 06:44:39.524 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:44:40 np0005603500 nova_compute[182934]: 2026-01-31 06:44:40.058 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:44:40 np0005603500 nova_compute[182934]: 2026-01-31 06:44:40.636 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:44:40 np0005603500 nova_compute[182934]: 2026-01-31 06:44:40.636 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:41 np0005603500 podman[217552]: 2026-01-31 06:44:41.146433508 +0000 UTC m=+0.054848346 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1769056855, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 31 01:44:41 np0005603500 podman[217551]: 2026-01-31 06:44:41.168337529 +0000 UTC m=+0.083348473 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.125 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.142 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.308 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.636 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.636 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.636 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.637 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.637 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.637 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.637 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:44:43 np0005603500 nova_compute[182934]: 2026-01-31 06:44:43.811 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:48 np0005603500 nova_compute[182934]: 2026-01-31 06:44:48.309 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:48 np0005603500 nova_compute[182934]: 2026-01-31 06:44:48.812 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:49 np0005603500 podman[217598]: 2026-01-31 06:44:49.131286151 +0000 UTC m=+0.046624280 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:44:49 np0005603500 podman[217597]: 2026-01-31 06:44:49.136284937 +0000 UTC m=+0.053754923 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:44:53 np0005603500 nova_compute[182934]: 2026-01-31 06:44:53.311 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:53 np0005603500 nova_compute[182934]: 2026-01-31 06:44:53.815 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:55.557 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:89:67 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56b76f4b-66cc-4f3c-b398-5988957638a2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a47c6cb4-b4f9-42e5-b250-0656effbfd20) old=Port_Binding(mac=['fa:16:3e:40:89:67'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:44:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:55.558 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a47c6cb4-b4f9-42e5-b250-0656effbfd20 in datapath 7b3bc875-2e1e-4a83-89b3-4fdc39da5699 updated
Jan 31 01:44:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:55.560 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b3bc875-2e1e-4a83-89b3-4fdc39da5699, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:44:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:55.561 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5e4cd8-9c3e-4237-8853-96eb577bc3fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:44:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:56.169 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:44:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:56.170 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:44:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:44:56.170 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:44:58 np0005603500 nova_compute[182934]: 2026-01-31 06:44:58.470 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:44:58 np0005603500 nova_compute[182934]: 2026-01-31 06:44:58.816 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:03 np0005603500 podman[217641]: 2026-01-31 06:45:03.142350339 +0000 UTC m=+0.048921272 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 01:45:03 np0005603500 podman[217640]: 2026-01-31 06:45:03.142379829 +0000 UTC m=+0.051196733 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:45:03 np0005603500 nova_compute[182934]: 2026-01-31 06:45:03.476 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:03 np0005603500 nova_compute[182934]: 2026-01-31 06:45:03.817 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:08 np0005603500 nova_compute[182934]: 2026-01-31 06:45:08.478 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:08 np0005603500 nova_compute[182934]: 2026-01-31 06:45:08.818 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:12 np0005603500 podman[217681]: 2026-01-31 06:45:12.137152602 +0000 UTC m=+0.047989336 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, release=1769056855, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 31 01:45:12 np0005603500 podman[217680]: 2026-01-31 06:45:12.161788355 +0000 UTC m=+0.075957464 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 01:45:13 np0005603500 nova_compute[182934]: 2026-01-31 06:45:13.491 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:13 np0005603500 nova_compute[182934]: 2026-01-31 06:45:13.821 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:16 np0005603500 nova_compute[182934]: 2026-01-31 06:45:16.066 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "c2a731dc-3ec4-4802-af1a-46b70f875be5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:16 np0005603500 nova_compute[182934]: 2026-01-31 06:45:16.067 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:16 np0005603500 nova_compute[182934]: 2026-01-31 06:45:16.573 182938 DEBUG nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:45:17 np0005603500 nova_compute[182934]: 2026-01-31 06:45:17.114 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:17 np0005603500 nova_compute[182934]: 2026-01-31 06:45:17.114 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:17 np0005603500 nova_compute[182934]: 2026-01-31 06:45:17.121 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:45:17 np0005603500 nova_compute[182934]: 2026-01-31 06:45:17.121 182938 INFO nova.compute.claims [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:45:18 np0005603500 nova_compute[182934]: 2026-01-31 06:45:18.193 182938 DEBUG nova.compute.provider_tree [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:45:18 np0005603500 nova_compute[182934]: 2026-01-31 06:45:18.493 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:18 np0005603500 nova_compute[182934]: 2026-01-31 06:45:18.712 182938 DEBUG nova.scheduler.client.report [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:45:18 np0005603500 nova_compute[182934]: 2026-01-31 06:45:18.821 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:19 np0005603500 nova_compute[182934]: 2026-01-31 06:45:19.220 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:19 np0005603500 nova_compute[182934]: 2026-01-31 06:45:19.221 182938 DEBUG nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:45:19 np0005603500 nova_compute[182934]: 2026-01-31 06:45:19.737 182938 DEBUG nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:45:19 np0005603500 nova_compute[182934]: 2026-01-31 06:45:19.737 182938 DEBUG nova.network.neutron [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:45:20 np0005603500 podman[217728]: 2026-01-31 06:45:20.138420499 +0000 UTC m=+0.055997870 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 01:45:20 np0005603500 podman[217729]: 2026-01-31 06:45:20.144202613 +0000 UTC m=+0.053319695 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:45:20 np0005603500 nova_compute[182934]: 2026-01-31 06:45:20.251 182938 INFO nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:45:21 np0005603500 nova_compute[182934]: 2026-01-31 06:45:21.014 182938 DEBUG nova.policy [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:45:21 np0005603500 nova_compute[182934]: 2026-01-31 06:45:21.048 182938 DEBUG nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.103 182938 DEBUG nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.105 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.106 182938 INFO nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Creating image(s)
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.107 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.107 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.109 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.110 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.118 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.120 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.180 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.182 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.182 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.183 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.189 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.189 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.253 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.254 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.835 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk 1073741824" returned: 0 in 0.581s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.837 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.837 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.889 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.890 182938 DEBUG nova.virt.disk.api [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.891 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.948 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.949 182938 DEBUG nova.virt.disk.api [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.949 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.949 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Ensure instance console log exists: /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.950 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.950 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:22 np0005603500 nova_compute[182934]: 2026-01-31 06:45:22.950 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:23.321 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:45:23 np0005603500 nova_compute[182934]: 2026-01-31 06:45:23.322 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:23 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:23.323 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:45:23 np0005603500 nova_compute[182934]: 2026-01-31 06:45:23.494 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:23 np0005603500 nova_compute[182934]: 2026-01-31 06:45:23.824 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:24 np0005603500 nova_compute[182934]: 2026-01-31 06:45:24.122 182938 DEBUG nova.network.neutron [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Successfully created port: 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:45:25 np0005603500 nova_compute[182934]: 2026-01-31 06:45:25.249 182938 DEBUG nova.network.neutron [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Successfully updated port: 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:45:25 np0005603500 nova_compute[182934]: 2026-01-31 06:45:25.488 182938 DEBUG nova.compute.manager [req-8167c9af-7c9a-4239-a779-5cbae47aab0e req-ad220e8b-6ce2-4a33-9eef-f9a255b6688f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received event network-changed-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:45:25 np0005603500 nova_compute[182934]: 2026-01-31 06:45:25.488 182938 DEBUG nova.compute.manager [req-8167c9af-7c9a-4239-a779-5cbae47aab0e req-ad220e8b-6ce2-4a33-9eef-f9a255b6688f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Refreshing instance network info cache due to event network-changed-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:45:25 np0005603500 nova_compute[182934]: 2026-01-31 06:45:25.488 182938 DEBUG oslo_concurrency.lockutils [req-8167c9af-7c9a-4239-a779-5cbae47aab0e req-ad220e8b-6ce2-4a33-9eef-f9a255b6688f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:45:25 np0005603500 nova_compute[182934]: 2026-01-31 06:45:25.489 182938 DEBUG oslo_concurrency.lockutils [req-8167c9af-7c9a-4239-a779-5cbae47aab0e req-ad220e8b-6ce2-4a33-9eef-f9a255b6688f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:45:25 np0005603500 nova_compute[182934]: 2026-01-31 06:45:25.489 182938 DEBUG nova.network.neutron [req-8167c9af-7c9a-4239-a779-5cbae47aab0e req-ad220e8b-6ce2-4a33-9eef-f9a255b6688f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Refreshing network info cache for port 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:45:25 np0005603500 nova_compute[182934]: 2026-01-31 06:45:25.857 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:45:26 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:26.327 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:45:26 np0005603500 nova_compute[182934]: 2026-01-31 06:45:26.390 182938 DEBUG nova.network.neutron [req-8167c9af-7c9a-4239-a779-5cbae47aab0e req-ad220e8b-6ce2-4a33-9eef-f9a255b6688f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:45:27 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:27Z|00154|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 31 01:45:27 np0005603500 nova_compute[182934]: 2026-01-31 06:45:27.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:27 np0005603500 nova_compute[182934]: 2026-01-31 06:45:27.221 182938 DEBUG nova.network.neutron [req-8167c9af-7c9a-4239-a779-5cbae47aab0e req-ad220e8b-6ce2-4a33-9eef-f9a255b6688f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:45:27 np0005603500 nova_compute[182934]: 2026-01-31 06:45:27.727 182938 DEBUG oslo_concurrency.lockutils [req-8167c9af-7c9a-4239-a779-5cbae47aab0e req-ad220e8b-6ce2-4a33-9eef-f9a255b6688f 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:45:27 np0005603500 nova_compute[182934]: 2026-01-31 06:45:27.728 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:45:27 np0005603500 nova_compute[182934]: 2026-01-31 06:45:27.728 182938 DEBUG nova.network.neutron [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:45:28 np0005603500 nova_compute[182934]: 2026-01-31 06:45:28.497 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:28 np0005603500 nova_compute[182934]: 2026-01-31 06:45:28.826 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:28 np0005603500 nova_compute[182934]: 2026-01-31 06:45:28.996 182938 DEBUG nova.network.neutron [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:45:31 np0005603500 nova_compute[182934]: 2026-01-31 06:45:31.734 182938 DEBUG nova.network.neutron [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Updating instance_info_cache with network_info: [{"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.240 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.241 182938 DEBUG nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Instance network_info: |[{"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.243 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Start _get_guest_xml network_info=[{"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.247 182938 WARNING nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.248 182938 DEBUG nova.virt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1168517125', uuid='c2a731dc-3ec4-4802-af1a-46b70f875be5'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_devic
e_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769841932.2480066) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.255 182938 DEBUG nova.virt.libvirt.host [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.256 182938 DEBUG nova.virt.libvirt.host [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.259 182938 DEBUG nova.virt.libvirt.host [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.259 182938 DEBUG nova.virt.libvirt.host [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.260 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.260 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.260 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.261 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.261 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.261 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.261 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.262 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.262 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.262 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.262 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.262 182938 DEBUG nova.virt.hardware [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.266 182938 DEBUG nova.virt.libvirt.vif [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:45:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1168517125',display_name='tempest-TestNetworkBasicOps-server-1168517125',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1168517125',id=10,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOFug/HTxsw4qAuOUyOVy7vltes/7jhsn7DdsGV7ihx5cjSPat4sI9CRbIVsiWEliMjZGBroUzDkAxbhSZJ+tDVF7b3joz0KfL7WOCtxFgAsUECyZ4vTDFkW3SYt6qm+nw==',key_name='tempest-TestNetworkBasicOps-1261257867',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-nqlio06b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:45:21Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=c2a731dc-3ec4-4802-af1a-46b70f875be5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.266 182938 DEBUG nova.network.os_vif_util [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.267 182938 DEBUG nova.network.os_vif_util [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:58:25,bridge_name='br-int',has_traffic_filtering=True,id=1374531f-4a2c-49e9-81ba-5cc0c3c1e0af,network=Network(7b3bc875-2e1e-4a83-89b3-4fdc39da5699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1374531f-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.267 182938 DEBUG nova.objects.instance [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid c2a731dc-3ec4-4802-af1a-46b70f875be5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.774 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <uuid>c2a731dc-3ec4-4802-af1a-46b70f875be5</uuid>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <name>instance-0000000a</name>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-1168517125</nova:name>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:45:32</nova:creationTime>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:        <nova:port uuid="1374531f-4a2c-49e9-81ba-5cc0c3c1e0af">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <entry name="serial">c2a731dc-3ec4-4802-af1a-46b70f875be5</entry>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <entry name="uuid">c2a731dc-3ec4-4802-af1a-46b70f875be5</entry>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk.config"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:96:58:25"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <target dev="tap1374531f-4a"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/console.log" append="off"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:45:32 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:45:32 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:45:32 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:45:32 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.775 182938 DEBUG nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Preparing to wait for external event network-vif-plugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.775 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.775 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.775 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.776 182938 DEBUG nova.virt.libvirt.vif [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:45:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1168517125',display_name='tempest-TestNetworkBasicOps-server-1168517125',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1168517125',id=10,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOFug/HTxsw4qAuOUyOVy7vltes/7jhsn7DdsGV7ihx5cjSPat4sI9CRbIVsiWEliMjZGBroUzDkAxbhSZJ+tDVF7b3joz0KfL7WOCtxFgAsUECyZ4vTDFkW3SYt6qm+nw==',key_name='tempest-TestNetworkBasicOps-1261257867',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-nqlio06b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:45:21Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=c2a731dc-3ec4-4802-af1a-46b70f875be5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.776 182938 DEBUG nova.network.os_vif_util [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.777 182938 DEBUG nova.network.os_vif_util [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:58:25,bridge_name='br-int',has_traffic_filtering=True,id=1374531f-4a2c-49e9-81ba-5cc0c3c1e0af,network=Network(7b3bc875-2e1e-4a83-89b3-4fdc39da5699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1374531f-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.777 182938 DEBUG os_vif [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:58:25,bridge_name='br-int',has_traffic_filtering=True,id=1374531f-4a2c-49e9-81ba-5cc0c3c1e0af,network=Network(7b3bc875-2e1e-4a83-89b3-4fdc39da5699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1374531f-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.778 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.778 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.778 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.779 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.779 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e784ca55-8191-5803-8d87-94e0020a0e0e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.780 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.783 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.785 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.786 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1374531f-4a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.786 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1374531f-4a, col_values=(('qos', UUID('aa013439-aca3-455a-af6a-c13a4b480787')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.786 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1374531f-4a, col_values=(('external_ids', {'iface-id': '1374531f-4a2c-49e9-81ba-5cc0c3c1e0af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:58:25', 'vm-uuid': 'c2a731dc-3ec4-4802-af1a-46b70f875be5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.787 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:32 np0005603500 NetworkManager[55506]: <info>  [1769841932.7885] manager: (tap1374531f-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.790 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.792 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:32 np0005603500 nova_compute[182934]: 2026-01-31 06:45:32.793 182938 INFO os_vif [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:58:25,bridge_name='br-int',has_traffic_filtering=True,id=1374531f-4a2c-49e9-81ba-5cc0c3c1e0af,network=Network(7b3bc875-2e1e-4a83-89b3-4fdc39da5699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1374531f-4a')
Jan 31 01:45:33 np0005603500 nova_compute[182934]: 2026-01-31 06:45:33.827 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:34 np0005603500 podman[217790]: 2026-01-31 06:45:34.132433184 +0000 UTC m=+0.045044902 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:45:34 np0005603500 podman[217791]: 2026-01-31 06:45:34.135250874 +0000 UTC m=+0.043260616 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 31 01:45:34 np0005603500 nova_compute[182934]: 2026-01-31 06:45:34.330 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:45:34 np0005603500 nova_compute[182934]: 2026-01-31 06:45:34.330 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:45:34 np0005603500 nova_compute[182934]: 2026-01-31 06:45:34.330 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:96:58:25, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:45:34 np0005603500 nova_compute[182934]: 2026-01-31 06:45:34.331 182938 INFO nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Using config drive
Jan 31 01:45:37 np0005603500 nova_compute[182934]: 2026-01-31 06:45:37.656 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:37 np0005603500 nova_compute[182934]: 2026-01-31 06:45:37.789 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:37 np0005603500 nova_compute[182934]: 2026-01-31 06:45:37.997 182938 INFO nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Creating config drive at /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk.config
Jan 31 01:45:38 np0005603500 nova_compute[182934]: 2026-01-31 06:45:38.001 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpnw4dt9ag execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:45:38 np0005603500 nova_compute[182934]: 2026-01-31 06:45:38.167 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:38 np0005603500 nova_compute[182934]: 2026-01-31 06:45:38.168 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:38 np0005603500 nova_compute[182934]: 2026-01-31 06:45:38.168 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:38 np0005603500 nova_compute[182934]: 2026-01-31 06:45:38.168 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:45:38 np0005603500 nova_compute[182934]: 2026-01-31 06:45:38.831 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.206 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.263 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.264 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.309 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.424 182938 DEBUG oslo_concurrency.processutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpnw4dt9ag" returned: 0 in 1.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.469 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.471 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5775MB free_disk=73.21021270751953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.471 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.472 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:39 np0005603500 kernel: tap1374531f-4a: entered promiscuous mode
Jan 31 01:45:39 np0005603500 NetworkManager[55506]: <info>  [1769841939.4750] manager: (tap1374531f-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Jan 31 01:45:39 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:39Z|00155|binding|INFO|Claiming lport 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af for this chassis.
Jan 31 01:45:39 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:39Z|00156|binding|INFO|1374531f-4a2c-49e9-81ba-5cc0c3c1e0af: Claiming fa:16:3e:96:58:25 10.100.0.9
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.477 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.484 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.488 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.494 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:58:25 10.100.0.9'], port_security=['fa:16:3e:96:58:25 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c2a731dc-3ec4-4802-af1a-46b70f875be5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '252da2b4-f4b9-42c9-bb68-873b356ebced', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56b76f4b-66cc-4f3c-b398-5988957638a2, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=1374531f-4a2c-49e9-81ba-5cc0c3c1e0af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.495 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af in datapath 7b3bc875-2e1e-4a83-89b3-4fdc39da5699 bound to our chassis
Jan 31 01:45:39 np0005603500 systemd-udevd[217858]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.497 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b3bc875-2e1e-4a83-89b3-4fdc39da5699
Jan 31 01:45:39 np0005603500 systemd-machined[154375]: New machine qemu-10-instance-0000000a.
Jan 31 01:45:39 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:39Z|00157|binding|INFO|Setting lport 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af ovn-installed in OVS
Jan 31 01:45:39 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:39Z|00158|binding|INFO|Setting lport 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af up in Southbound
Jan 31 01:45:39 np0005603500 NetworkManager[55506]: <info>  [1769841939.5108] device (tap1374531f-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:45:39 np0005603500 NetworkManager[55506]: <info>  [1769841939.5118] device (tap1374531f-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.512 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[7da20d42-927a-48be-a1b1-722d30d3c6f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.513 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b3bc875-21 in ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.513 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.516 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b3bc875-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.516 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f7b29667-56d5-40b3-82f6-ec22255ea647]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.518 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8de48ad2-83e9-41d6-ae5c-6553c94dc1f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.530 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc66837-ddb9-4fb8-be88-cd1c35e85732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.545 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[538e7097-e71a-4988-be6e-2f038ad81fda]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.570 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[213f843b-fd37-4e9a-82a1-39a7899b2723]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 NetworkManager[55506]: <info>  [1769841939.5783] manager: (tap7b3bc875-20): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.578 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[35fb2559-adc6-4d81-9e95-dea04955cda7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.608 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[a625ca99-fd9e-4a09-9626-96fea1399b55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.612 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa680dd-8013-4a88-a608-a0becc69b4e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 NetworkManager[55506]: <info>  [1769841939.6353] device (tap7b3bc875-20): carrier: link connected
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.641 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[910d1147-2c55-42f6-af98-cc8da5a2531c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.661 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a33d9c1e-849e-47cf-a83e-7136dcc4b7b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b3bc875-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:89:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425093, 'reachable_time': 26920, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217894, 'error': None, 'target': 'ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.681 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[069f4ee7-efef-40ed-bb54-140387df09e1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:8967'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425093, 'tstamp': 425093}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217899, 'error': None, 'target': 'ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.704 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[0551fef5-e6bc-40c4-a432-4d9a91a99393]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b3bc875-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:89:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425093, 'reachable_time': 26920, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217900, 'error': None, 'target': 'ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.740 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f5415e61-42eb-4a15-bfff-fb87486fe586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.791 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b8c287-6bb5-473f-b7ff-f921dc3f0fea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.793 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b3bc875-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.794 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.794 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b3bc875-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.796 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:39 np0005603500 NetworkManager[55506]: <info>  [1769841939.7968] manager: (tap7b3bc875-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 31 01:45:39 np0005603500 kernel: tap7b3bc875-20: entered promiscuous mode
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.798 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.800 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b3bc875-20, col_values=(('external_ids', {'iface-id': 'a47c6cb4-b4f9-42e5-b250-0656effbfd20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.801 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:39 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:39Z|00159|binding|INFO|Releasing lport a47c6cb4-b4f9-42e5-b250-0656effbfd20 from this chassis (sb_readonly=0)
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.803 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4b53c240-a60d-4378-ba31-d1ef7fba2df1]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.804 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.804 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.804 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 7b3bc875-2e1e-4a83-89b3-4fdc39da5699 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.804 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.805 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a3080ea7-00e7-4eeb-8762-95e02d0540ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.806 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:45:39 np0005603500 nova_compute[182934]: 2026-01-31 06:45:39.806 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.806 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[74f21e5f-c262-4e89-a66f-b24942ed5cc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.807 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-7b3bc875-2e1e-4a83-89b3-4fdc39da5699
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID 7b3bc875-2e1e-4a83-89b3-4fdc39da5699
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:45:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:39.808 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'env', 'PROCESS_TAG=haproxy-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.031 182938 DEBUG nova.compute.manager [req-a8478572-9bc2-46af-8573-56219fec1947 req-d4d3b90c-fa62-4a07-a42a-f5b57b0b2207 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received event network-vif-plugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.031 182938 DEBUG oslo_concurrency.lockutils [req-a8478572-9bc2-46af-8573-56219fec1947 req-d4d3b90c-fa62-4a07-a42a-f5b57b0b2207 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.031 182938 DEBUG oslo_concurrency.lockutils [req-a8478572-9bc2-46af-8573-56219fec1947 req-d4d3b90c-fa62-4a07-a42a-f5b57b0b2207 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.032 182938 DEBUG oslo_concurrency.lockutils [req-a8478572-9bc2-46af-8573-56219fec1947 req-d4d3b90c-fa62-4a07-a42a-f5b57b0b2207 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.032 182938 DEBUG nova.compute.manager [req-a8478572-9bc2-46af-8573-56219fec1947 req-d4d3b90c-fa62-4a07-a42a-f5b57b0b2207 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Processing event network-vif-plugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.032 182938 DEBUG nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.036 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.042 182938 INFO nova.virt.libvirt.driver [-] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Instance spawned successfully.
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.042 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:45:40 np0005603500 podman[217933]: 2026-01-31 06:45:40.193306396 +0000 UTC m=+0.079277750 container create edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 01:45:40 np0005603500 podman[217933]: 2026-01-31 06:45:40.134666503 +0000 UTC m=+0.020637887 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:45:40 np0005603500 systemd[1]: Started libpod-conmon-edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8.scope.
Jan 31 01:45:40 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:45:40 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17348ce55d168fbe8f9d0298be134b036a699f8bc84bd1c8a12e62578169f6bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:45:40 np0005603500 podman[217933]: 2026-01-31 06:45:40.315863449 +0000 UTC m=+0.201834823 container init edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:45:40 np0005603500 podman[217933]: 2026-01-31 06:45:40.320979611 +0000 UTC m=+0.206950965 container start edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 01:45:40 np0005603500 neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699[217948]: [NOTICE]   (217952) : New worker (217954) forked
Jan 31 01:45:40 np0005603500 neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699[217948]: [NOTICE]   (217952) : Loading success.
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.525 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance c2a731dc-3ec4-4802-af1a-46b70f875be5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.525 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.526 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.553 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.553 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.554 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.554 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.555 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.555 182938 DEBUG nova.virt.libvirt.driver [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:45:40 np0005603500 nova_compute[182934]: 2026-01-31 06:45:40.565 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:45:41 np0005603500 nova_compute[182934]: 2026-01-31 06:45:41.063 182938 INFO nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Took 18.96 seconds to spawn the instance on the hypervisor.
Jan 31 01:45:41 np0005603500 nova_compute[182934]: 2026-01-31 06:45:41.064 182938 DEBUG nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:45:41 np0005603500 nova_compute[182934]: 2026-01-31 06:45:41.072 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:45:41 np0005603500 nova_compute[182934]: 2026-01-31 06:45:41.597 182938 INFO nova.compute.manager [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Took 24.51 seconds to build instance.
Jan 31 01:45:41 np0005603500 nova_compute[182934]: 2026-01-31 06:45:41.612 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:45:41 np0005603500 nova_compute[182934]: 2026-01-31 06:45:41.613 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:42 np0005603500 nova_compute[182934]: 2026-01-31 06:45:42.145 182938 DEBUG oslo_concurrency.lockutils [None req-480a0c3d-4488-474c-9632-69ae9021f5dc dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:42 np0005603500 nova_compute[182934]: 2026-01-31 06:45:42.231 182938 DEBUG nova.compute.manager [req-3de7b529-93f1-4b95-a152-c763b2a18480 req-eb40cf6c-3258-4ea3-a9d4-ce1752803631 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received event network-vif-plugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:45:42 np0005603500 nova_compute[182934]: 2026-01-31 06:45:42.231 182938 DEBUG oslo_concurrency.lockutils [req-3de7b529-93f1-4b95-a152-c763b2a18480 req-eb40cf6c-3258-4ea3-a9d4-ce1752803631 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:42 np0005603500 nova_compute[182934]: 2026-01-31 06:45:42.232 182938 DEBUG oslo_concurrency.lockutils [req-3de7b529-93f1-4b95-a152-c763b2a18480 req-eb40cf6c-3258-4ea3-a9d4-ce1752803631 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:42 np0005603500 nova_compute[182934]: 2026-01-31 06:45:42.232 182938 DEBUG oslo_concurrency.lockutils [req-3de7b529-93f1-4b95-a152-c763b2a18480 req-eb40cf6c-3258-4ea3-a9d4-ce1752803631 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:42 np0005603500 nova_compute[182934]: 2026-01-31 06:45:42.232 182938 DEBUG nova.compute.manager [req-3de7b529-93f1-4b95-a152-c763b2a18480 req-eb40cf6c-3258-4ea3-a9d4-ce1752803631 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] No waiting events found dispatching network-vif-plugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:45:42 np0005603500 nova_compute[182934]: 2026-01-31 06:45:42.232 182938 WARNING nova.compute.manager [req-3de7b529-93f1-4b95-a152-c763b2a18480 req-eb40cf6c-3258-4ea3-a9d4-ce1752803631 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received unexpected event network-vif-plugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af for instance with vm_state active and task_state None.
Jan 31 01:45:42 np0005603500 nova_compute[182934]: 2026-01-31 06:45:42.790 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:43 np0005603500 nova_compute[182934]: 2026-01-31 06:45:43.103 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:43 np0005603500 podman[217964]: 2026-01-31 06:45:43.138682788 +0000 UTC m=+0.053185191 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Jan 31 01:45:43 np0005603500 podman[217963]: 2026-01-31 06:45:43.160933335 +0000 UTC m=+0.076070258 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 01:45:43 np0005603500 nova_compute[182934]: 2026-01-31 06:45:43.654 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:43 np0005603500 nova_compute[182934]: 2026-01-31 06:45:43.655 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:43 np0005603500 nova_compute[182934]: 2026-01-31 06:45:43.655 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:43 np0005603500 nova_compute[182934]: 2026-01-31 06:45:43.655 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:43 np0005603500 nova_compute[182934]: 2026-01-31 06:45:43.655 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:43 np0005603500 nova_compute[182934]: 2026-01-31 06:45:43.655 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:45:43 np0005603500 nova_compute[182934]: 2026-01-31 06:45:43.694 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:43 np0005603500 nova_compute[182934]: 2026-01-31 06:45:43.831 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:44 np0005603500 nova_compute[182934]: 2026-01-31 06:45:44.879 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:44 np0005603500 NetworkManager[55506]: <info>  [1769841944.8796] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Jan 31 01:45:44 np0005603500 NetworkManager[55506]: <info>  [1769841944.8807] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Jan 31 01:45:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:44Z|00160|binding|INFO|Releasing lport a47c6cb4-b4f9-42e5-b250-0656effbfd20 from this chassis (sb_readonly=0)
Jan 31 01:45:44 np0005603500 nova_compute[182934]: 2026-01-31 06:45:44.889 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:44 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:44Z|00161|binding|INFO|Releasing lport a47c6cb4-b4f9-42e5-b250-0656effbfd20 from this chassis (sb_readonly=0)
Jan 31 01:45:44 np0005603500 nova_compute[182934]: 2026-01-31 06:45:44.900 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:45 np0005603500 nova_compute[182934]: 2026-01-31 06:45:45.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:45 np0005603500 nova_compute[182934]: 2026-01-31 06:45:45.533 182938 DEBUG nova.compute.manager [req-2f0e3d75-a0f0-4d4e-b2a6-62782af964ee req-07a3141a-38ee-420a-8616-6593dfd6f980 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received event network-changed-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:45:45 np0005603500 nova_compute[182934]: 2026-01-31 06:45:45.533 182938 DEBUG nova.compute.manager [req-2f0e3d75-a0f0-4d4e-b2a6-62782af964ee req-07a3141a-38ee-420a-8616-6593dfd6f980 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Refreshing instance network info cache due to event network-changed-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:45:45 np0005603500 nova_compute[182934]: 2026-01-31 06:45:45.533 182938 DEBUG oslo_concurrency.lockutils [req-2f0e3d75-a0f0-4d4e-b2a6-62782af964ee req-07a3141a-38ee-420a-8616-6593dfd6f980 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:45:45 np0005603500 nova_compute[182934]: 2026-01-31 06:45:45.534 182938 DEBUG oslo_concurrency.lockutils [req-2f0e3d75-a0f0-4d4e-b2a6-62782af964ee req-07a3141a-38ee-420a-8616-6593dfd6f980 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:45:45 np0005603500 nova_compute[182934]: 2026-01-31 06:45:45.534 182938 DEBUG nova.network.neutron [req-2f0e3d75-a0f0-4d4e-b2a6-62782af964ee req-07a3141a-38ee-420a-8616-6593dfd6f980 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Refreshing network info cache for port 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:45:47 np0005603500 nova_compute[182934]: 2026-01-31 06:45:47.791 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:48 np0005603500 nova_compute[182934]: 2026-01-31 06:45:48.833 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:49 np0005603500 nova_compute[182934]: 2026-01-31 06:45:49.701 182938 DEBUG nova.network.neutron [req-2f0e3d75-a0f0-4d4e-b2a6-62782af964ee req-07a3141a-38ee-420a-8616-6593dfd6f980 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Updated VIF entry in instance network info cache for port 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:45:49 np0005603500 nova_compute[182934]: 2026-01-31 06:45:49.702 182938 DEBUG nova.network.neutron [req-2f0e3d75-a0f0-4d4e-b2a6-62782af964ee req-07a3141a-38ee-420a-8616-6593dfd6f980 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Updating instance_info_cache with network_info: [{"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:45:50 np0005603500 nova_compute[182934]: 2026-01-31 06:45:50.210 182938 DEBUG oslo_concurrency.lockutils [req-2f0e3d75-a0f0-4d4e-b2a6-62782af964ee req-07a3141a-38ee-420a-8616-6593dfd6f980 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:45:51 np0005603500 podman[218023]: 2026-01-31 06:45:51.124093971 +0000 UTC m=+0.041838491 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:45:51 np0005603500 nova_compute[182934]: 2026-01-31 06:45:51.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:45:51 np0005603500 nova_compute[182934]: 2026-01-31 06:45:51.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Jan 31 01:45:51 np0005603500 podman[218024]: 2026-01-31 06:45:51.159346561 +0000 UTC m=+0.072046110 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:45:51 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:51Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:96:58:25 10.100.0.9
Jan 31 01:45:51 np0005603500 ovn_controller[95398]: 2026-01-31T06:45:51Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:96:58:25 10.100.0.9
Jan 31 01:45:51 np0005603500 nova_compute[182934]: 2026-01-31 06:45:51.655 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Jan 31 01:45:52 np0005603500 nova_compute[182934]: 2026-01-31 06:45:52.793 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:53 np0005603500 nova_compute[182934]: 2026-01-31 06:45:53.835 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:56.220 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:45:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:56.220 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:45:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:45:56.221 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:45:57 np0005603500 nova_compute[182934]: 2026-01-31 06:45:57.794 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:45:57 np0005603500 nova_compute[182934]: 2026-01-31 06:45:57.991 182938 INFO nova.compute.manager [None req-c01a8a93-57da-49d3-9e9a-4db7978063f8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Get console output
Jan 31 01:45:57 np0005603500 nova_compute[182934]: 2026-01-31 06:45:57.997 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:45:58 np0005603500 nova_compute[182934]: 2026-01-31 06:45:58.836 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:00 np0005603500 ovn_controller[95398]: 2026-01-31T06:46:00Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:96:58:25 10.100.0.9
Jan 31 01:46:01 np0005603500 nova_compute[182934]: 2026-01-31 06:46:01.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:01 np0005603500 nova_compute[182934]: 2026-01-31 06:46:01.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.433 182938 DEBUG nova.compute.manager [req-6ecd6557-25ea-4b14-b7ea-286cfa1a199f req-b1015ff8-188e-434f-9bbe-73afdfd7d89c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received event network-changed-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.434 182938 DEBUG nova.compute.manager [req-6ecd6557-25ea-4b14-b7ea-286cfa1a199f req-b1015ff8-188e-434f-9bbe-73afdfd7d89c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Refreshing instance network info cache due to event network-changed-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.434 182938 DEBUG oslo_concurrency.lockutils [req-6ecd6557-25ea-4b14-b7ea-286cfa1a199f req-b1015ff8-188e-434f-9bbe-73afdfd7d89c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.434 182938 DEBUG oslo_concurrency.lockutils [req-6ecd6557-25ea-4b14-b7ea-286cfa1a199f req-b1015ff8-188e-434f-9bbe-73afdfd7d89c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.435 182938 DEBUG nova.network.neutron [req-6ecd6557-25ea-4b14-b7ea-286cfa1a199f req-b1015ff8-188e-434f-9bbe-73afdfd7d89c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Refreshing network info cache for port 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.795 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.946 182938 DEBUG oslo_concurrency.lockutils [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "c2a731dc-3ec4-4802-af1a-46b70f875be5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.947 182938 DEBUG oslo_concurrency.lockutils [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.947 182938 DEBUG oslo_concurrency.lockutils [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.948 182938 DEBUG oslo_concurrency.lockutils [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.948 182938 DEBUG oslo_concurrency.lockutils [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:02 np0005603500 nova_compute[182934]: 2026-01-31 06:46:02.949 182938 INFO nova.compute.manager [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Terminating instance
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.457 182938 DEBUG nova.compute.manager [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:46:03 np0005603500 kernel: tap1374531f-4a (unregistering): left promiscuous mode
Jan 31 01:46:03 np0005603500 NetworkManager[55506]: <info>  [1769841963.4872] device (tap1374531f-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:46:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:46:03Z|00162|binding|INFO|Releasing lport 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af from this chassis (sb_readonly=0)
Jan 31 01:46:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:46:03Z|00163|binding|INFO|Setting lport 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af down in Southbound
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.492 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:03 np0005603500 ovn_controller[95398]: 2026-01-31T06:46:03Z|00164|binding|INFO|Removing iface tap1374531f-4a ovn-installed in OVS
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.500 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.501 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:58:25 10.100.0.9'], port_security=['fa:16:3e:96:58:25 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c2a731dc-3ec4-4802-af1a-46b70f875be5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '252da2b4-f4b9-42c9-bb68-873b356ebced', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56b76f4b-66cc-4f3c-b398-5988957638a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=1374531f-4a2c-49e9-81ba-5cc0c3c1e0af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.502 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af in datapath 7b3bc875-2e1e-4a83-89b3-4fdc39da5699 unbound from our chassis
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.503 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b3bc875-2e1e-4a83-89b3-4fdc39da5699, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.504 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[31a9688b-eca0-476f-b0b2-27190cb0853c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.504 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699 namespace which is not needed anymore
Jan 31 01:46:03 np0005603500 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 31 01:46:03 np0005603500 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 11.999s CPU time.
Jan 31 01:46:03 np0005603500 systemd-machined[154375]: Machine qemu-10-instance-0000000a terminated.
Jan 31 01:46:03 np0005603500 neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699[217948]: [NOTICE]   (217952) : haproxy version is 2.8.14-c23fe91
Jan 31 01:46:03 np0005603500 neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699[217948]: [NOTICE]   (217952) : path to executable is /usr/sbin/haproxy
Jan 31 01:46:03 np0005603500 neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699[217948]: [WARNING]  (217952) : Exiting Master process...
Jan 31 01:46:03 np0005603500 podman[218092]: 2026-01-31 06:46:03.598662173 +0000 UTC m=+0.024898202 container kill edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 01:46:03 np0005603500 neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699[217948]: [ALERT]    (217952) : Current worker (217954) exited with code 143 (Terminated)
Jan 31 01:46:03 np0005603500 neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699[217948]: [WARNING]  (217952) : All workers exited. Exiting... (0)
Jan 31 01:46:03 np0005603500 systemd[1]: libpod-edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8.scope: Deactivated successfully.
Jan 31 01:46:03 np0005603500 podman[218108]: 2026-01-31 06:46:03.632917071 +0000 UTC m=+0.019585813 container died edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 01:46:03 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8-userdata-shm.mount: Deactivated successfully.
Jan 31 01:46:03 np0005603500 systemd[1]: var-lib-containers-storage-overlay-17348ce55d168fbe8f9d0298be134b036a699f8bc84bd1c8a12e62578169f6bb-merged.mount: Deactivated successfully.
Jan 31 01:46:03 np0005603500 podman[218108]: 2026-01-31 06:46:03.656626035 +0000 UTC m=+0.043294747 container cleanup edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:46:03 np0005603500 systemd[1]: libpod-conmon-edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8.scope: Deactivated successfully.
Jan 31 01:46:03 np0005603500 podman[218109]: 2026-01-31 06:46:03.669885076 +0000 UTC m=+0.053452190 container remove edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.700 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[be5c0080-eb8b-472f-8707-6f077be3988e]: (4, ("Sat Jan 31 06:46:03 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699 (edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8)\nedb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8\nSat Jan 31 06:46:03 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699 (edb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8)\nedb358ed32c98065084e9f5cb9fdcca406a669a8fea433acd4ef560ee6ed65c8\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.701 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[78413cb0-9c18-4081-9e1b-d5b1676432c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.701 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b3bc875-2e1e-4a83-89b3-4fdc39da5699.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.702 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[1c59c332-9a03-42c3-bb4a-1b80c3b64ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.702 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b3bc875-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.702 182938 INFO nova.virt.libvirt.driver [-] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Instance destroyed successfully.
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.703 182938 DEBUG nova.objects.instance [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid c2a731dc-3ec4-4802-af1a-46b70f875be5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.704 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:03 np0005603500 kernel: tap7b3bc875-20: left promiscuous mode
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.710 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.713 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4255c9a5-ae8b-42a9-9751-6f08411eab9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.729 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e901d0e1-d69d-4eaa-b7b7-b56ef5663c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.731 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ba87d296-2b62-469b-bbe8-30874daad1e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.742 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[5c804315-5a65-4fbb-8ca2-12cc19b512a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425086, 'reachable_time': 26905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218155, 'error': None, 'target': 'ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:03 np0005603500 systemd[1]: run-netns-ovnmeta\x2d7b3bc875\x2d2e1e\x2d4a83\x2d89b3\x2d4fdc39da5699.mount: Deactivated successfully.
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.744 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b3bc875-2e1e-4a83-89b3-4fdc39da5699 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:46:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:03.745 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[d477966d-6a69-4b8c-9e18-00569ad05517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.837 182938 DEBUG nova.compute.manager [req-da0e065b-1d6b-4875-a681-f676c61aff32 req-40b8fe79-43e3-4e2b-9a60-b16cb28f505a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received event network-vif-unplugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.837 182938 DEBUG oslo_concurrency.lockutils [req-da0e065b-1d6b-4875-a681-f676c61aff32 req-40b8fe79-43e3-4e2b-9a60-b16cb28f505a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.840 182938 DEBUG oslo_concurrency.lockutils [req-da0e065b-1d6b-4875-a681-f676c61aff32 req-40b8fe79-43e3-4e2b-9a60-b16cb28f505a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.840 182938 DEBUG oslo_concurrency.lockutils [req-da0e065b-1d6b-4875-a681-f676c61aff32 req-40b8fe79-43e3-4e2b-9a60-b16cb28f505a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.840 182938 DEBUG nova.compute.manager [req-da0e065b-1d6b-4875-a681-f676c61aff32 req-40b8fe79-43e3-4e2b-9a60-b16cb28f505a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] No waiting events found dispatching network-vif-unplugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.840 182938 DEBUG nova.compute.manager [req-da0e065b-1d6b-4875-a681-f676c61aff32 req-40b8fe79-43e3-4e2b-9a60-b16cb28f505a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received event network-vif-unplugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:46:03 np0005603500 nova_compute[182934]: 2026-01-31 06:46:03.841 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.214 182938 DEBUG nova.virt.libvirt.vif [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:45:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1168517125',display_name='tempest-TestNetworkBasicOps-server-1168517125',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1168517125',id=10,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOFug/HTxsw4qAuOUyOVy7vltes/7jhsn7DdsGV7ihx5cjSPat4sI9CRbIVsiWEliMjZGBroUzDkAxbhSZJ+tDVF7b3joz0KfL7WOCtxFgAsUECyZ4vTDFkW3SYt6qm+nw==',key_name='tempest-TestNetworkBasicOps-1261257867',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:45:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-nqlio06b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:45:41Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=c2a731dc-3ec4-4802-af1a-46b70f875be5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.215 182938 DEBUG nova.network.os_vif_util [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.216 182938 DEBUG nova.network.os_vif_util [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:96:58:25,bridge_name='br-int',has_traffic_filtering=True,id=1374531f-4a2c-49e9-81ba-5cc0c3c1e0af,network=Network(7b3bc875-2e1e-4a83-89b3-4fdc39da5699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1374531f-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.216 182938 DEBUG os_vif [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:58:25,bridge_name='br-int',has_traffic_filtering=True,id=1374531f-4a2c-49e9-81ba-5cc0c3c1e0af,network=Network(7b3bc875-2e1e-4a83-89b3-4fdc39da5699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1374531f-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.219 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.219 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1374531f-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.221 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.221 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.222 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.222 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=aa013439-aca3-455a-af6a-c13a4b480787) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.223 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.223 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.226 182938 INFO os_vif [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:58:25,bridge_name='br-int',has_traffic_filtering=True,id=1374531f-4a2c-49e9-81ba-5cc0c3c1e0af,network=Network(7b3bc875-2e1e-4a83-89b3-4fdc39da5699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1374531f-4a')
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.227 182938 INFO nova.virt.libvirt.driver [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Deleting instance files /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5_del
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.227 182938 INFO nova.virt.libvirt.driver [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Deletion of /var/lib/nova/instances/c2a731dc-3ec4-4802-af1a-46b70f875be5_del complete
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.740 182938 INFO nova.compute.manager [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Took 1.28 seconds to destroy the instance on the hypervisor.
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.740 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.740 182938 DEBUG nova.compute.manager [-] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:46:04 np0005603500 nova_compute[182934]: 2026-01-31 06:46:04.740 182938 DEBUG nova.network.neutron [-] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:46:05 np0005603500 podman[218157]: 2026-01-31 06:46:05.12832493 +0000 UTC m=+0.040079335 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 01:46:05 np0005603500 podman[218156]: 2026-01-31 06:46:05.128493675 +0000 UTC m=+0.042363917 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.110 182938 DEBUG nova.compute.manager [req-1b9afe38-f575-4790-81b2-90a4d58c8c21 req-e4593183-9ee3-4c94-853e-8238a8640035 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received event network-vif-plugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.110 182938 DEBUG oslo_concurrency.lockutils [req-1b9afe38-f575-4790-81b2-90a4d58c8c21 req-e4593183-9ee3-4c94-853e-8238a8640035 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.111 182938 DEBUG oslo_concurrency.lockutils [req-1b9afe38-f575-4790-81b2-90a4d58c8c21 req-e4593183-9ee3-4c94-853e-8238a8640035 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.111 182938 DEBUG oslo_concurrency.lockutils [req-1b9afe38-f575-4790-81b2-90a4d58c8c21 req-e4593183-9ee3-4c94-853e-8238a8640035 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.111 182938 DEBUG nova.compute.manager [req-1b9afe38-f575-4790-81b2-90a4d58c8c21 req-e4593183-9ee3-4c94-853e-8238a8640035 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] No waiting events found dispatching network-vif-plugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.111 182938 WARNING nova.compute.manager [req-1b9afe38-f575-4790-81b2-90a4d58c8c21 req-e4593183-9ee3-4c94-853e-8238a8640035 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received unexpected event network-vif-plugged-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af for instance with vm_state active and task_state deleting.
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.140 182938 DEBUG nova.network.neutron [req-6ecd6557-25ea-4b14-b7ea-286cfa1a199f req-b1015ff8-188e-434f-9bbe-73afdfd7d89c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Updated VIF entry in instance network info cache for port 1374531f-4a2c-49e9-81ba-5cc0c3c1e0af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.141 182938 DEBUG nova.network.neutron [req-6ecd6557-25ea-4b14-b7ea-286cfa1a199f req-b1015ff8-188e-434f-9bbe-73afdfd7d89c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Updating instance_info_cache with network_info: [{"id": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "address": "fa:16:3e:96:58:25", "network": {"id": "7b3bc875-2e1e-4a83-89b3-4fdc39da5699", "bridge": "br-int", "label": "tempest-network-smoke--636953574", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1374531f-4a", "ovs_interfaceid": "1374531f-4a2c-49e9-81ba-5cc0c3c1e0af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.532 182938 DEBUG nova.network.neutron [-] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:46:06 np0005603500 nova_compute[182934]: 2026-01-31 06:46:06.646 182938 DEBUG oslo_concurrency.lockutils [req-6ecd6557-25ea-4b14-b7ea-286cfa1a199f req-b1015ff8-188e-434f-9bbe-73afdfd7d89c 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-c2a731dc-3ec4-4802-af1a-46b70f875be5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:46:07 np0005603500 nova_compute[182934]: 2026-01-31 06:46:07.040 182938 INFO nova.compute.manager [-] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Took 2.30 seconds to deallocate network for instance.
Jan 31 01:46:07 np0005603500 nova_compute[182934]: 2026-01-31 06:46:07.549 182938 DEBUG oslo_concurrency.lockutils [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:07 np0005603500 nova_compute[182934]: 2026-01-31 06:46:07.549 182938 DEBUG oslo_concurrency.lockutils [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:07 np0005603500 nova_compute[182934]: 2026-01-31 06:46:07.609 182938 DEBUG nova.compute.provider_tree [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:46:08 np0005603500 nova_compute[182934]: 2026-01-31 06:46:08.117 182938 DEBUG nova.scheduler.client.report [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:46:08 np0005603500 nova_compute[182934]: 2026-01-31 06:46:08.333 182938 DEBUG nova.compute.manager [req-559806b0-ab3f-46d3-99c4-756d83053ec7 req-628120a7-caa3-47dc-804f-6780041ff07a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: c2a731dc-3ec4-4802-af1a-46b70f875be5] Received event network-vif-deleted-1374531f-4a2c-49e9-81ba-5cc0c3c1e0af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:46:08 np0005603500 nova_compute[182934]: 2026-01-31 06:46:08.628 182938 DEBUG oslo_concurrency.lockutils [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:08 np0005603500 nova_compute[182934]: 2026-01-31 06:46:08.657 182938 INFO nova.scheduler.client.report [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance c2a731dc-3ec4-4802-af1a-46b70f875be5
Jan 31 01:46:08 np0005603500 nova_compute[182934]: 2026-01-31 06:46:08.839 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:09 np0005603500 nova_compute[182934]: 2026-01-31 06:46:09.223 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:09 np0005603500 nova_compute[182934]: 2026-01-31 06:46:09.677 182938 DEBUG oslo_concurrency.lockutils [None req-8f14cfdb-4afc-43a3-8682-560cffe6c4b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "c2a731dc-3ec4-4802-af1a-46b70f875be5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:13 np0005603500 nova_compute[182934]: 2026-01-31 06:46:13.358 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:13 np0005603500 nova_compute[182934]: 2026-01-31 06:46:13.382 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:13 np0005603500 nova_compute[182934]: 2026-01-31 06:46:13.840 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:14 np0005603500 podman[218197]: 2026-01-31 06:46:14.153209326 +0000 UTC m=+0.069829490 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:46:14 np0005603500 podman[218198]: 2026-01-31 06:46:14.16653762 +0000 UTC m=+0.077652229 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, vendor=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Jan 31 01:46:14 np0005603500 nova_compute[182934]: 2026-01-31 06:46:14.225 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.984 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.986 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:46:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:46:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:46:18 np0005603500 nova_compute[182934]: 2026-01-31 06:46:18.842 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:19 np0005603500 nova_compute[182934]: 2026-01-31 06:46:19.228 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:22 np0005603500 podman[218245]: 2026-01-31 06:46:22.130340816 +0000 UTC m=+0.049036469 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:46:22 np0005603500 podman[218244]: 2026-01-31 06:46:22.138711312 +0000 UTC m=+0.059787901 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 01:46:23 np0005603500 nova_compute[182934]: 2026-01-31 06:46:23.844 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:24 np0005603500 nova_compute[182934]: 2026-01-31 06:46:24.231 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:27.650 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:54:b8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1eee62de-b98b-4359-b27b-63fb0219f31c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=055d2587-07d8-48d2-bdce-e8f3e4584c68, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=060d8bd6-d243-4a7c-b3cb-d0f6dfd68585) old=Port_Binding(mac=['fa:16:3e:c4:54:b8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1eee62de-b98b-4359-b27b-63fb0219f31c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:46:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:27.651 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 060d8bd6-d243-4a7c-b3cb-d0f6dfd68585 in datapath 1eee62de-b98b-4359-b27b-63fb0219f31c updated
Jan 31 01:46:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:27.653 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1eee62de-b98b-4359-b27b-63fb0219f31c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:46:27 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:27.654 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba70944-378a-47e6-ac9f-71bfa64b0c92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:46:28 np0005603500 nova_compute[182934]: 2026-01-31 06:46:28.866 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:29.205 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:46:29 np0005603500 nova_compute[182934]: 2026-01-31 06:46:29.206 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:29 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:29.207 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:46:29 np0005603500 nova_compute[182934]: 2026-01-31 06:46:29.232 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:33 np0005603500 nova_compute[182934]: 2026-01-31 06:46:33.869 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:34 np0005603500 nova_compute[182934]: 2026-01-31 06:46:34.234 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:35 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:35.208 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:46:36 np0005603500 podman[218288]: 2026-01-31 06:46:36.125104384 +0000 UTC m=+0.048029337 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:46:36 np0005603500 podman[218289]: 2026-01-31 06:46:36.12652674 +0000 UTC m=+0.047154010 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 01:46:37 np0005603500 nova_compute[182934]: 2026-01-31 06:46:37.655 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:38 np0005603500 nova_compute[182934]: 2026-01-31 06:46:38.288 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:38 np0005603500 nova_compute[182934]: 2026-01-31 06:46:38.288 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:38 np0005603500 nova_compute[182934]: 2026-01-31 06:46:38.288 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:38 np0005603500 nova_compute[182934]: 2026-01-31 06:46:38.289 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:46:38 np0005603500 nova_compute[182934]: 2026-01-31 06:46:38.418 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:46:38 np0005603500 nova_compute[182934]: 2026-01-31 06:46:38.419 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5771MB free_disk=73.2117805480957GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:46:38 np0005603500 nova_compute[182934]: 2026-01-31 06:46:38.419 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:38 np0005603500 nova_compute[182934]: 2026-01-31 06:46:38.419 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:38 np0005603500 nova_compute[182934]: 2026-01-31 06:46:38.871 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:39 np0005603500 nova_compute[182934]: 2026-01-31 06:46:39.237 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:40 np0005603500 nova_compute[182934]: 2026-01-31 06:46:40.629 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:46:40 np0005603500 nova_compute[182934]: 2026-01-31 06:46:40.629 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:46:40 np0005603500 nova_compute[182934]: 2026-01-31 06:46:40.875 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing inventories for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Jan 31 01:46:41 np0005603500 nova_compute[182934]: 2026-01-31 06:46:41.095 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating ProviderTree inventory for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Jan 31 01:46:41 np0005603500 nova_compute[182934]: 2026-01-31 06:46:41.095 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:46:41 np0005603500 nova_compute[182934]: 2026-01-31 06:46:41.111 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing aggregate associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Jan 31 01:46:41 np0005603500 nova_compute[182934]: 2026-01-31 06:46:41.131 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing trait associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, traits: COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,HW_ARCH_X86_64,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_CRB,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SHA,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_FMA3,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI _refresh_associations 
/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Jan 31 01:46:41 np0005603500 nova_compute[182934]: 2026-01-31 06:46:41.159 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:46:41 np0005603500 nova_compute[182934]: 2026-01-31 06:46:41.678 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:46:42 np0005603500 nova_compute[182934]: 2026-01-31 06:46:42.220 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:46:42 np0005603500 nova_compute[182934]: 2026-01-31 06:46:42.221 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:43 np0005603500 nova_compute[182934]: 2026-01-31 06:46:43.713 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:43 np0005603500 nova_compute[182934]: 2026-01-31 06:46:43.714 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:43 np0005603500 nova_compute[182934]: 2026-01-31 06:46:43.714 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:43 np0005603500 nova_compute[182934]: 2026-01-31 06:46:43.714 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:43 np0005603500 nova_compute[182934]: 2026-01-31 06:46:43.714 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:43 np0005603500 nova_compute[182934]: 2026-01-31 06:46:43.715 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:46:43 np0005603500 nova_compute[182934]: 2026-01-31 06:46:43.874 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:44 np0005603500 nova_compute[182934]: 2026-01-31 06:46:44.239 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:45 np0005603500 podman[218332]: 2026-01-31 06:46:45.123778879 +0000 UTC m=+0.042846712 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, version=9.7, 
distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 01:46:45 np0005603500 nova_compute[182934]: 2026-01-31 06:46:45.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:45 np0005603500 nova_compute[182934]: 2026-01-31 06:46:45.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:45 np0005603500 podman[218331]: 2026-01-31 06:46:45.177422784 +0000 UTC m=+0.097527250 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3)
Jan 31 01:46:47 np0005603500 nova_compute[182934]: 2026-01-31 06:46:47.673 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:46:48 np0005603500 nova_compute[182934]: 2026-01-31 06:46:48.875 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:49 np0005603500 nova_compute[182934]: 2026-01-31 06:46:49.242 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:53 np0005603500 podman[218378]: 2026-01-31 06:46:53.128480693 +0000 UTC m=+0.050146664 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:46:53 np0005603500 podman[218379]: 2026-01-31 06:46:53.133214904 +0000 UTC m=+0.050108163 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:46:53 np0005603500 nova_compute[182934]: 2026-01-31 06:46:53.539 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:53 np0005603500 nova_compute[182934]: 2026-01-31 06:46:53.539 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:53 np0005603500 nova_compute[182934]: 2026-01-31 06:46:53.877 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:54 np0005603500 nova_compute[182934]: 2026-01-31 06:46:54.110 182938 DEBUG nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:46:54 np0005603500 nova_compute[182934]: 2026-01-31 06:46:54.244 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:54 np0005603500 nova_compute[182934]: 2026-01-31 06:46:54.681 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:54 np0005603500 nova_compute[182934]: 2026-01-31 06:46:54.682 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:54 np0005603500 nova_compute[182934]: 2026-01-31 06:46:54.709 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:46:54 np0005603500 nova_compute[182934]: 2026-01-31 06:46:54.710 182938 INFO nova.compute.claims [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:46:55 np0005603500 nova_compute[182934]: 2026-01-31 06:46:55.821 182938 DEBUG nova.compute.provider_tree [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:46:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:56.281 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:56.282 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:46:56.282 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:56 np0005603500 nova_compute[182934]: 2026-01-31 06:46:56.331 182938 DEBUG nova.scheduler.client.report [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:46:56 np0005603500 nova_compute[182934]: 2026-01-31 06:46:56.840 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:56 np0005603500 nova_compute[182934]: 2026-01-31 06:46:56.841 182938 DEBUG nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:46:57 np0005603500 nova_compute[182934]: 2026-01-31 06:46:57.358 182938 DEBUG nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:46:57 np0005603500 nova_compute[182934]: 2026-01-31 06:46:57.359 182938 DEBUG nova.network.neutron [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:46:57 np0005603500 nova_compute[182934]: 2026-01-31 06:46:57.873 182938 INFO nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:46:58 np0005603500 nova_compute[182934]: 2026-01-31 06:46:58.034 182938 DEBUG nova.policy [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:46:58 np0005603500 nova_compute[182934]: 2026-01-31 06:46:58.383 182938 DEBUG nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:46:58 np0005603500 nova_compute[182934]: 2026-01-31 06:46:58.879 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.245 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:46:59 np0005603500 ovn_controller[95398]: 2026-01-31T06:46:59Z|00165|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.405 182938 DEBUG nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.407 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.407 182938 INFO nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Creating image(s)
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.408 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.408 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.409 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.409 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.412 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.414 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.463 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.464 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.464 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.465 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.468 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.469 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.520 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.522 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.557 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.558 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.559 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.604 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.605 182938 DEBUG nova.virt.disk.api [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.605 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.626 182938 DEBUG nova.network.neutron [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Successfully created port: 3c1979fb-e961-4676-b6c2-2ca71f2d859b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.658 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.659 182938 DEBUG nova.virt.disk.api [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.659 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.659 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Ensure instance console log exists: /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.660 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.660 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:46:59 np0005603500 nova_compute[182934]: 2026-01-31 06:46:59.660 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:01 np0005603500 nova_compute[182934]: 2026-01-31 06:47:01.035 182938 DEBUG nova.network.neutron [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Successfully updated port: 3c1979fb-e961-4676-b6c2-2ca71f2d859b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:47:01 np0005603500 nova_compute[182934]: 2026-01-31 06:47:01.280 182938 DEBUG nova.compute.manager [req-c0a00ef2-2a06-46da-922d-f5e3e3b6a9c6 req-f8adbd28-d4f4-4f9a-9a48-df755402d24a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:47:01 np0005603500 nova_compute[182934]: 2026-01-31 06:47:01.281 182938 DEBUG nova.compute.manager [req-c0a00ef2-2a06-46da-922d-f5e3e3b6a9c6 req-f8adbd28-d4f4-4f9a-9a48-df755402d24a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing instance network info cache due to event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:47:01 np0005603500 nova_compute[182934]: 2026-01-31 06:47:01.281 182938 DEBUG oslo_concurrency.lockutils [req-c0a00ef2-2a06-46da-922d-f5e3e3b6a9c6 req-f8adbd28-d4f4-4f9a-9a48-df755402d24a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:47:01 np0005603500 nova_compute[182934]: 2026-01-31 06:47:01.281 182938 DEBUG oslo_concurrency.lockutils [req-c0a00ef2-2a06-46da-922d-f5e3e3b6a9c6 req-f8adbd28-d4f4-4f9a-9a48-df755402d24a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:47:01 np0005603500 nova_compute[182934]: 2026-01-31 06:47:01.281 182938 DEBUG nova.network.neutron [req-c0a00ef2-2a06-46da-922d-f5e3e3b6a9c6 req-f8adbd28-d4f4-4f9a-9a48-df755402d24a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing network info cache for port 3c1979fb-e961-4676-b6c2-2ca71f2d859b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:47:01 np0005603500 nova_compute[182934]: 2026-01-31 06:47:01.556 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:47:03 np0005603500 nova_compute[182934]: 2026-01-31 06:47:03.016 182938 DEBUG nova.network.neutron [req-c0a00ef2-2a06-46da-922d-f5e3e3b6a9c6 req-f8adbd28-d4f4-4f9a-9a48-df755402d24a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:47:03 np0005603500 nova_compute[182934]: 2026-01-31 06:47:03.880 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:04 np0005603500 nova_compute[182934]: 2026-01-31 06:47:04.248 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:04 np0005603500 nova_compute[182934]: 2026-01-31 06:47:04.253 182938 DEBUG nova.network.neutron [req-c0a00ef2-2a06-46da-922d-f5e3e3b6a9c6 req-f8adbd28-d4f4-4f9a-9a48-df755402d24a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:47:04 np0005603500 nova_compute[182934]: 2026-01-31 06:47:04.768 182938 DEBUG oslo_concurrency.lockutils [req-c0a00ef2-2a06-46da-922d-f5e3e3b6a9c6 req-f8adbd28-d4f4-4f9a-9a48-df755402d24a 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:47:04 np0005603500 nova_compute[182934]: 2026-01-31 06:47:04.769 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:47:04 np0005603500 nova_compute[182934]: 2026-01-31 06:47:04.769 182938 DEBUG nova.network.neutron [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:47:06 np0005603500 nova_compute[182934]: 2026-01-31 06:47:06.036 182938 DEBUG nova.network.neutron [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:47:07 np0005603500 podman[218433]: 2026-01-31 06:47:07.153875716 +0000 UTC m=+0.070719578 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:47:07 np0005603500 podman[218434]: 2026-01-31 06:47:07.167836309 +0000 UTC m=+0.077369720 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.056 182938 DEBUG nova.network.neutron [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updating instance_info_cache with network_info: [{"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.564 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.564 182938 DEBUG nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Instance network_info: |[{"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.567 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Start _get_guest_xml network_info=[{"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.570 182938 WARNING nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.571 182938 DEBUG nova.virt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-794782075', uuid='5b239873-eca6-4fdc-b15d-2801c75cafa9'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device
_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769842028.5719008) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.584 182938 DEBUG nova.virt.libvirt.host [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.585 182938 DEBUG nova.virt.libvirt.host [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.589 182938 DEBUG nova.virt.libvirt.host [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.590 182938 DEBUG nova.virt.libvirt.host [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.590 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.590 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.591 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.591 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.591 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.592 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.592 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.592 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.592 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.593 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.593 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.593 182938 DEBUG nova.virt.hardware [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.597 182938 DEBUG nova.virt.libvirt.vif [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:46:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-794782075',display_name='tempest-TestNetworkBasicOps-server-794782075',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-794782075',id=11,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE2josOk/g7wyoAw5iRuxWJY/V6rjRWyBU/8lsm36Dp437FCLAzteof7QKaa4WajgP5R2MmljJavY4Zh0uXVaRbYiaXjUL7K+61WxHZHp8+wGYv0/oOjd9Rd1zJ/UsE5PQ==',key_name='tempest-TestNetworkBasicOps-248708636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-c8g2fcmv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:46:58Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=5b239873-eca6-4fdc-b15d-2801c75cafa9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.598 182938 DEBUG nova.network.os_vif_util [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.599 182938 DEBUG nova.network.os_vif_util [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c1979fb-e961-4676-b6c2-2ca71f2d859b,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c1979fb-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.599 182938 DEBUG nova.objects.instance [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b239873-eca6-4fdc-b15d-2801c75cafa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:47:08 np0005603500 nova_compute[182934]: 2026-01-31 06:47:08.882 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.107 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <uuid>5b239873-eca6-4fdc-b15d-2801c75cafa9</uuid>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <name>instance-0000000b</name>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-794782075</nova:name>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:47:08</nova:creationTime>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:        <nova:port uuid="3c1979fb-e961-4676-b6c2-2ca71f2d859b">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <entry name="serial">5b239873-eca6-4fdc-b15d-2801c75cafa9</entry>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <entry name="uuid">5b239873-eca6-4fdc-b15d-2801c75cafa9</entry>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.config"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:a7:e9:1e"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <target dev="tap3c1979fb-e9"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/console.log" append="off"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:47:09 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:47:09 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:47:09 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:47:09 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.108 182938 DEBUG nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Preparing to wait for external event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.108 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.108 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.108 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.109 182938 DEBUG nova.virt.libvirt.vif [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:46:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-794782075',display_name='tempest-TestNetworkBasicOps-server-794782075',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-794782075',id=11,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE2josOk/g7wyoAw5iRuxWJY/V6rjRWyBU/8lsm36Dp437FCLAzteof7QKaa4WajgP5R2MmljJavY4Zh0uXVaRbYiaXjUL7K+61WxHZHp8+wGYv0/oOjd9Rd1zJ/UsE5PQ==',key_name='tempest-TestNetworkBasicOps-248708636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-c8g2fcmv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:46:58Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=5b239873-eca6-4fdc-b15d-2801c75cafa9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.109 182938 DEBUG nova.network.os_vif_util [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.110 182938 DEBUG nova.network.os_vif_util [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c1979fb-e961-4676-b6c2-2ca71f2d859b,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c1979fb-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.110 182938 DEBUG os_vif [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c1979fb-e961-4676-b6c2-2ca71f2d859b,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c1979fb-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.111 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.111 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.111 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.112 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.112 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f0f49872-dae5-5c05-a653-69db9d3833f9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.113 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.115 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.117 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.117 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c1979fb-e9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.118 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3c1979fb-e9, col_values=(('qos', UUID('9de26e0d-3a1a-4a13-bb87-8b27e33dec40')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.118 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3c1979fb-e9, col_values=(('external_ids', {'iface-id': '3c1979fb-e961-4676-b6c2-2ca71f2d859b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:e9:1e', 'vm-uuid': '5b239873-eca6-4fdc-b15d-2801c75cafa9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.119 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:09 np0005603500 NetworkManager[55506]: <info>  [1769842029.1202] manager: (tap3c1979fb-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.121 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.124 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:09 np0005603500 nova_compute[182934]: 2026-01-31 06:47:09.124 182938 INFO os_vif [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c1979fb-e961-4676-b6c2-2ca71f2d859b,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c1979fb-e9')
Jan 31 01:47:10 np0005603500 nova_compute[182934]: 2026-01-31 06:47:10.662 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:47:10 np0005603500 nova_compute[182934]: 2026-01-31 06:47:10.663 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:47:10 np0005603500 nova_compute[182934]: 2026-01-31 06:47:10.663 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:a7:e9:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:47:10 np0005603500 nova_compute[182934]: 2026-01-31 06:47:10.664 182938 INFO nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Using config drive
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.388 182938 INFO nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Creating config drive at /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.config
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.392 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp6rsmv3x8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.511 182938 DEBUG oslo_concurrency.processutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp6rsmv3x8" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:47:12 np0005603500 kernel: tap3c1979fb-e9: entered promiscuous mode
Jan 31 01:47:12 np0005603500 NetworkManager[55506]: <info>  [1769842032.5532] manager: (tap3c1979fb-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Jan 31 01:47:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:12Z|00166|binding|INFO|Claiming lport 3c1979fb-e961-4676-b6c2-2ca71f2d859b for this chassis.
Jan 31 01:47:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:12Z|00167|binding|INFO|3c1979fb-e961-4676-b6c2-2ca71f2d859b: Claiming fa:16:3e:a7:e9:1e 10.100.0.7
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.553 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.558 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.560 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:12 np0005603500 systemd-machined[154375]: New machine qemu-11-instance-0000000b.
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.583 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:e9:1e 10.100.0.7'], port_security=['fa:16:3e:a7:e9:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '5b239873-eca6-4fdc-b15d-2801c75cafa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1eee62de-b98b-4359-b27b-63fb0219f31c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '68c99352-a463-4f0e-8dbc-014b6eb7e45f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=055d2587-07d8-48d2-bdce-e8f3e4584c68, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=3c1979fb-e961-4676-b6c2-2ca71f2d859b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.584 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 3c1979fb-e961-4676-b6c2-2ca71f2d859b in datapath 1eee62de-b98b-4359-b27b-63fb0219f31c bound to our chassis
Jan 31 01:47:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:12Z|00168|binding|INFO|Setting lport 3c1979fb-e961-4676-b6c2-2ca71f2d859b ovn-installed in OVS
Jan 31 01:47:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:12Z|00169|binding|INFO|Setting lport 3c1979fb-e961-4676-b6c2-2ca71f2d859b up in Southbound
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.586 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1eee62de-b98b-4359-b27b-63fb0219f31c
Jan 31 01:47:12 np0005603500 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.589 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:12 np0005603500 systemd-udevd[218497]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.596 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[789641e5-cfd2-4648-9601-6f7bee047f60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.597 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1eee62de-b1 in ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.600 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1eee62de-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.600 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f83ec23d-ac8b-4b16-8eac-ee4ff5fd69b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 NetworkManager[55506]: <info>  [1769842032.6073] device (tap3c1979fb-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.606 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[66385236-aca0-421f-9444-61e20eaa15ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 NetworkManager[55506]: <info>  [1769842032.6079] device (tap3c1979fb-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.613 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[21652d71-e6b5-4735-84e5-64d045c84c26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.632 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[5225fe74-1594-4d31-866d-9d8dae9da873]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.658 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[718f2d74-4596-4251-a990-5c93d5ef634a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 systemd-udevd[218499]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.663 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[430127c5-b14b-4c32-8ed6-47665f7c3f2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 NetworkManager[55506]: <info>  [1769842032.6647] manager: (tap1eee62de-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.683 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[82bf938c-746a-40d5-8ddc-6da826885a8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.685 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[2d5f18f7-2925-40ad-9ca5-7b14d9ec0d96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 NetworkManager[55506]: <info>  [1769842032.6990] device (tap1eee62de-b0): carrier: link connected
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.702 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5a5ff9-8edb-4eb9-b30f-313a16696bc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.715 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[bcbbfffc-2410-4695-a93d-7fc11ecfeedd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1eee62de-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:54:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434399, 'reachable_time': 44609, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218529, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.728 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[b98c6cfc-b863-4e3a-8007-63849682d4ef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:54b8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434399, 'tstamp': 434399}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218530, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.739 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[03db8fb1-7879-45dc-910d-9057129705da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1eee62de-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:54:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434399, 'reachable_time': 44609, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218532, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.760 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8de808-7f61-4e61-b9bf-b1457b5538ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.798 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[cd08c560-80c5-4f7b-a9ad-c776d37723c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.799 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1eee62de-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.799 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.800 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1eee62de-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.801 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:12 np0005603500 NetworkManager[55506]: <info>  [1769842032.8018] manager: (tap1eee62de-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 31 01:47:12 np0005603500 kernel: tap1eee62de-b0: entered promiscuous mode
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.803 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.804 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1eee62de-b0, col_values=(('external_ids', {'iface-id': '060d8bd6-d243-4a7c-b3cb-d0f6dfd68585'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.805 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:12 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:12Z|00170|binding|INFO|Releasing lport 060d8bd6-d243-4a7c-b3cb-d0f6dfd68585 from this chassis (sb_readonly=0)
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.806 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.808 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f59c9f-d6dc-43f5-9742-26190904c16a]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 nova_compute[182934]: 2026-01-31 06:47:12.809 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.809 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.809 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.810 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 1eee62de-b98b-4359-b27b-63fb0219f31c disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.810 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.810 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[50398fc0-6878-48ea-bd26-99a96aa772a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.811 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.811 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d29a430e-8ed6-4731-a3ad-90286b29168f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.811 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-1eee62de-b98b-4359-b27b-63fb0219f31c
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID 1eee62de-b98b-4359-b27b-63fb0219f31c
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:47:12 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:12.812 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'env', 'PROCESS_TAG=haproxy-1eee62de-b98b-4359-b27b-63fb0219f31c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1eee62de-b98b-4359-b27b-63fb0219f31c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:47:13 np0005603500 podman[218570]: 2026-01-31 06:47:13.139996842 +0000 UTC m=+0.050609339 container create 9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:47:13 np0005603500 systemd[1]: Started libpod-conmon-9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62.scope.
Jan 31 01:47:13 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:47:13 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02739d5386649c30f52c3bdde2be6af934ef7aa6c2ce2ef9d37e5b16957b2e5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:47:13 np0005603500 podman[218570]: 2026-01-31 06:47:13.114150252 +0000 UTC m=+0.024762769 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:47:13 np0005603500 podman[218570]: 2026-01-31 06:47:13.211486974 +0000 UTC m=+0.122099481 container init 9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 01:47:13 np0005603500 podman[218570]: 2026-01-31 06:47:13.215042737 +0000 UTC m=+0.125655234 container start 9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 01:47:13 np0005603500 neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c[218585]: [NOTICE]   (218589) : New worker (218591) forked
Jan 31 01:47:13 np0005603500 neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c[218585]: [NOTICE]   (218589) : Loading success.
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.304 182938 DEBUG nova.compute.manager [req-225df2a2-d05e-4f6f-ac94-0c0e71342d3c req-827db2de-a7df-4ba5-b482-6a141d396857 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.305 182938 DEBUG oslo_concurrency.lockutils [req-225df2a2-d05e-4f6f-ac94-0c0e71342d3c req-827db2de-a7df-4ba5-b482-6a141d396857 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.306 182938 DEBUG oslo_concurrency.lockutils [req-225df2a2-d05e-4f6f-ac94-0c0e71342d3c req-827db2de-a7df-4ba5-b482-6a141d396857 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.306 182938 DEBUG oslo_concurrency.lockutils [req-225df2a2-d05e-4f6f-ac94-0c0e71342d3c req-827db2de-a7df-4ba5-b482-6a141d396857 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.306 182938 DEBUG nova.compute.manager [req-225df2a2-d05e-4f6f-ac94-0c0e71342d3c req-827db2de-a7df-4ba5-b482-6a141d396857 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Processing event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.307 182938 DEBUG nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.312 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.320 182938 INFO nova.virt.libvirt.driver [-] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Instance spawned successfully.
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.321 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.831 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.832 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.833 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.833 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.834 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.834 182938 DEBUG nova.virt.libvirt.driver [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:13 np0005603500 nova_compute[182934]: 2026-01-31 06:47:13.885 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:14 np0005603500 nova_compute[182934]: 2026-01-31 06:47:14.119 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:14 np0005603500 nova_compute[182934]: 2026-01-31 06:47:14.344 182938 INFO nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Took 14.94 seconds to spawn the instance on the hypervisor.
Jan 31 01:47:14 np0005603500 nova_compute[182934]: 2026-01-31 06:47:14.344 182938 DEBUG nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:47:14 np0005603500 nova_compute[182934]: 2026-01-31 06:47:14.864 182938 INFO nova.compute.manager [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Took 20.22 seconds to build instance.
Jan 31 01:47:15 np0005603500 nova_compute[182934]: 2026-01-31 06:47:15.371 182938 DEBUG oslo_concurrency.lockutils [None req-0e929bff-03ec-4799-9684-8c7617b114df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.832s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:15 np0005603500 nova_compute[182934]: 2026-01-31 06:47:15.988 182938 DEBUG nova.compute.manager [req-82c89d5e-0537-4665-989f-c7cac2c113f2 req-e4c5089b-5d39-42c0-9f8c-2b94b93d0100 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:47:15 np0005603500 nova_compute[182934]: 2026-01-31 06:47:15.989 182938 DEBUG oslo_concurrency.lockutils [req-82c89d5e-0537-4665-989f-c7cac2c113f2 req-e4c5089b-5d39-42c0-9f8c-2b94b93d0100 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:15 np0005603500 nova_compute[182934]: 2026-01-31 06:47:15.989 182938 DEBUG oslo_concurrency.lockutils [req-82c89d5e-0537-4665-989f-c7cac2c113f2 req-e4c5089b-5d39-42c0-9f8c-2b94b93d0100 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:15 np0005603500 nova_compute[182934]: 2026-01-31 06:47:15.990 182938 DEBUG oslo_concurrency.lockutils [req-82c89d5e-0537-4665-989f-c7cac2c113f2 req-e4c5089b-5d39-42c0-9f8c-2b94b93d0100 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:15 np0005603500 nova_compute[182934]: 2026-01-31 06:47:15.990 182938 DEBUG nova.compute.manager [req-82c89d5e-0537-4665-989f-c7cac2c113f2 req-e4c5089b-5d39-42c0-9f8c-2b94b93d0100 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] No waiting events found dispatching network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:47:15 np0005603500 nova_compute[182934]: 2026-01-31 06:47:15.990 182938 WARNING nova.compute.manager [req-82c89d5e-0537-4665-989f-c7cac2c113f2 req-e4c5089b-5d39-42c0-9f8c-2b94b93d0100 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received unexpected event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b for instance with vm_state active and task_state None.
Jan 31 01:47:16 np0005603500 podman[218601]: 2026-01-31 06:47:16.136433358 +0000 UTC m=+0.050064651 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-type=git, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 31 01:47:16 np0005603500 podman[218600]: 2026-01-31 06:47:16.179743544 +0000 UTC m=+0.093511602 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:47:18 np0005603500 nova_compute[182934]: 2026-01-31 06:47:18.886 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:19 np0005603500 nova_compute[182934]: 2026-01-31 06:47:19.121 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:21 np0005603500 nova_compute[182934]: 2026-01-31 06:47:21.463 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:21 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:21Z|00171|binding|INFO|Releasing lport 060d8bd6-d243-4a7c-b3cb-d0f6dfd68585 from this chassis (sb_readonly=0)
Jan 31 01:47:21 np0005603500 NetworkManager[55506]: <info>  [1769842041.4651] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 31 01:47:21 np0005603500 NetworkManager[55506]: <info>  [1769842041.4657] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 31 01:47:21 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:21Z|00172|binding|INFO|Releasing lport 060d8bd6-d243-4a7c-b3cb-d0f6dfd68585 from this chassis (sb_readonly=0)
Jan 31 01:47:21 np0005603500 nova_compute[182934]: 2026-01-31 06:47:21.471 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:21 np0005603500 nova_compute[182934]: 2026-01-31 06:47:21.476 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:22 np0005603500 nova_compute[182934]: 2026-01-31 06:47:22.065 182938 DEBUG nova.compute.manager [req-757a55c2-ba82-407c-a735-46e0305dc816 req-8a346f36-6d6d-4951-bfe3-245fa0af9df7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:47:22 np0005603500 nova_compute[182934]: 2026-01-31 06:47:22.066 182938 DEBUG nova.compute.manager [req-757a55c2-ba82-407c-a735-46e0305dc816 req-8a346f36-6d6d-4951-bfe3-245fa0af9df7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing instance network info cache due to event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:47:22 np0005603500 nova_compute[182934]: 2026-01-31 06:47:22.067 182938 DEBUG oslo_concurrency.lockutils [req-757a55c2-ba82-407c-a735-46e0305dc816 req-8a346f36-6d6d-4951-bfe3-245fa0af9df7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:47:22 np0005603500 nova_compute[182934]: 2026-01-31 06:47:22.067 182938 DEBUG oslo_concurrency.lockutils [req-757a55c2-ba82-407c-a735-46e0305dc816 req-8a346f36-6d6d-4951-bfe3-245fa0af9df7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:47:22 np0005603500 nova_compute[182934]: 2026-01-31 06:47:22.067 182938 DEBUG nova.network.neutron [req-757a55c2-ba82-407c-a735-46e0305dc816 req-8a346f36-6d6d-4951-bfe3-245fa0af9df7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing network info cache for port 3c1979fb-e961-4676-b6c2-2ca71f2d859b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:47:23 np0005603500 nova_compute[182934]: 2026-01-31 06:47:23.888 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:24 np0005603500 nova_compute[182934]: 2026-01-31 06:47:24.124 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:24 np0005603500 podman[218666]: 2026-01-31 06:47:24.14477158 +0000 UTC m=+0.058350845 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:47:24 np0005603500 podman[218667]: 2026-01-31 06:47:24.164863878 +0000 UTC m=+0.070663195 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 31 01:47:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:24Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a7:e9:1e 10.100.0.7
Jan 31 01:47:24 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:24Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a7:e9:1e 10.100.0.7
Jan 31 01:47:27 np0005603500 nova_compute[182934]: 2026-01-31 06:47:27.062 182938 DEBUG nova.network.neutron [req-757a55c2-ba82-407c-a735-46e0305dc816 req-8a346f36-6d6d-4951-bfe3-245fa0af9df7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updated VIF entry in instance network info cache for port 3c1979fb-e961-4676-b6c2-2ca71f2d859b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:47:27 np0005603500 nova_compute[182934]: 2026-01-31 06:47:27.063 182938 DEBUG nova.network.neutron [req-757a55c2-ba82-407c-a735-46e0305dc816 req-8a346f36-6d6d-4951-bfe3-245fa0af9df7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updating instance_info_cache with network_info: [{"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:47:27 np0005603500 nova_compute[182934]: 2026-01-31 06:47:27.685 182938 DEBUG oslo_concurrency.lockutils [req-757a55c2-ba82-407c-a735-46e0305dc816 req-8a346f36-6d6d-4951-bfe3-245fa0af9df7 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:47:28 np0005603500 nova_compute[182934]: 2026-01-31 06:47:28.890 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:29 np0005603500 nova_compute[182934]: 2026-01-31 06:47:29.145 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:33 np0005603500 nova_compute[182934]: 2026-01-31 06:47:33.892 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:34 np0005603500 nova_compute[182934]: 2026-01-31 06:47:34.146 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:36 np0005603500 nova_compute[182934]: 2026-01-31 06:47:36.923 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "19b9866a-ffdf-4074-b605-12988cf688fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:36 np0005603500 nova_compute[182934]: 2026-01-31 06:47:36.924 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:37 np0005603500 nova_compute[182934]: 2026-01-31 06:47:37.430 182938 DEBUG nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:47:37 np0005603500 nova_compute[182934]: 2026-01-31 06:47:37.969 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:37 np0005603500 nova_compute[182934]: 2026-01-31 06:47:37.970 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:37 np0005603500 nova_compute[182934]: 2026-01-31 06:47:37.978 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:47:37 np0005603500 nova_compute[182934]: 2026-01-31 06:47:37.978 182938 INFO nova.compute.claims [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:47:38 np0005603500 podman[218710]: 2026-01-31 06:47:38.118913953 +0000 UTC m=+0.040807667 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:47:38 np0005603500 podman[218711]: 2026-01-31 06:47:38.125025837 +0000 UTC m=+0.044320919 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 31 01:47:38 np0005603500 nova_compute[182934]: 2026-01-31 06:47:38.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:47:38 np0005603500 nova_compute[182934]: 2026-01-31 06:47:38.661 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:38 np0005603500 nova_compute[182934]: 2026-01-31 06:47:38.895 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:39 np0005603500 nova_compute[182934]: 2026-01-31 06:47:39.051 182938 DEBUG nova.compute.provider_tree [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:47:39 np0005603500 nova_compute[182934]: 2026-01-31 06:47:39.148 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:39 np0005603500 nova_compute[182934]: 2026-01-31 06:47:39.559 182938 DEBUG nova.scheduler.client.report [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:47:40 np0005603500 nova_compute[182934]: 2026-01-31 06:47:40.069 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:40 np0005603500 nova_compute[182934]: 2026-01-31 06:47:40.069 182938 DEBUG nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:47:40 np0005603500 nova_compute[182934]: 2026-01-31 06:47:40.072 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:40 np0005603500 nova_compute[182934]: 2026-01-31 06:47:40.072 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:40 np0005603500 nova_compute[182934]: 2026-01-31 06:47:40.072 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:47:40 np0005603500 nova_compute[182934]: 2026-01-31 06:47:40.581 182938 DEBUG nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:47:40 np0005603500 nova_compute[182934]: 2026-01-31 06:47:40.581 182938 DEBUG nova.network.neutron [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.044 182938 DEBUG nova.policy [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.089 182938 INFO nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.129 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.174 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.175 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.223 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.358 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.360 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5604MB free_disk=73.18295669555664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.360 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.360 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:41 np0005603500 nova_compute[182934]: 2026-01-31 06:47:41.597 182938 DEBUG nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.406 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 5b239873-eca6-4fdc-b15d-2801c75cafa9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.406 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 19b9866a-ffdf-4074-b605-12988cf688fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.407 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.407 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.463 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.477 182938 DEBUG nova.network.neutron [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Successfully created port: 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.615 182938 DEBUG nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.617 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.617 182938 INFO nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Creating image(s)
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.618 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.618 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.619 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.619 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.624 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.627 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.688 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.689 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.689 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.691 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.696 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.697 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.740 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.741 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.769 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.770 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.770 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.819 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.819 182938 DEBUG nova.virt.disk.api [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.820 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.861 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.862 182938 DEBUG nova.virt.disk.api [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.862 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.862 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Ensure instance console log exists: /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.863 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.863 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.863 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:42 np0005603500 nova_compute[182934]: 2026-01-31 06:47:42.982 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:47:43 np0005603500 nova_compute[182934]: 2026-01-31 06:47:43.491 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:47:43 np0005603500 nova_compute[182934]: 2026-01-31 06:47:43.491 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:43 np0005603500 nova_compute[182934]: 2026-01-31 06:47:43.528 182938 DEBUG nova.network.neutron [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Successfully updated port: 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:47:43 np0005603500 nova_compute[182934]: 2026-01-31 06:47:43.716 182938 DEBUG nova.compute.manager [req-327c8081-ee69-4b5b-9e24-288b447509d7 req-d07065ef-5f2e-4fa4-aec6-1da5910c596e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received event network-changed-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:47:43 np0005603500 nova_compute[182934]: 2026-01-31 06:47:43.716 182938 DEBUG nova.compute.manager [req-327c8081-ee69-4b5b-9e24-288b447509d7 req-d07065ef-5f2e-4fa4-aec6-1da5910c596e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Refreshing instance network info cache due to event network-changed-93bc43c2-a00a-4c71-a3e9-b7b0306969c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:47:43 np0005603500 nova_compute[182934]: 2026-01-31 06:47:43.716 182938 DEBUG oslo_concurrency.lockutils [req-327c8081-ee69-4b5b-9e24-288b447509d7 req-d07065ef-5f2e-4fa4-aec6-1da5910c596e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:47:43 np0005603500 nova_compute[182934]: 2026-01-31 06:47:43.716 182938 DEBUG oslo_concurrency.lockutils [req-327c8081-ee69-4b5b-9e24-288b447509d7 req-d07065ef-5f2e-4fa4-aec6-1da5910c596e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:47:43 np0005603500 nova_compute[182934]: 2026-01-31 06:47:43.716 182938 DEBUG nova.network.neutron [req-327c8081-ee69-4b5b-9e24-288b447509d7 req-d07065ef-5f2e-4fa4-aec6-1da5910c596e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Refreshing network info cache for port 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:47:43 np0005603500 nova_compute[182934]: 2026-01-31 06:47:43.895 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:44 np0005603500 nova_compute[182934]: 2026-01-31 06:47:44.034 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:47:44 np0005603500 nova_compute[182934]: 2026-01-31 06:47:44.149 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:44 np0005603500 nova_compute[182934]: 2026-01-31 06:47:44.493 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:47:45 np0005603500 nova_compute[182934]: 2026-01-31 06:47:45.002 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:47:45 np0005603500 nova_compute[182934]: 2026-01-31 06:47:45.002 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:47:45 np0005603500 nova_compute[182934]: 2026-01-31 06:47:45.002 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:47:45 np0005603500 nova_compute[182934]: 2026-01-31 06:47:45.002 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:47:45 np0005603500 nova_compute[182934]: 2026-01-31 06:47:45.003 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:47:45 np0005603500 nova_compute[182934]: 2026-01-31 06:47:45.039 182938 DEBUG nova.network.neutron [req-327c8081-ee69-4b5b-9e24-288b447509d7 req-d07065ef-5f2e-4fa4-aec6-1da5910c596e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:47:45 np0005603500 nova_compute[182934]: 2026-01-31 06:47:45.652 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:47:46 np0005603500 nova_compute[182934]: 2026-01-31 06:47:46.013 182938 DEBUG nova.network.neutron [req-327c8081-ee69-4b5b-9e24-288b447509d7 req-d07065ef-5f2e-4fa4-aec6-1da5910c596e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:47:46 np0005603500 nova_compute[182934]: 2026-01-31 06:47:46.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:47:46 np0005603500 nova_compute[182934]: 2026-01-31 06:47:46.519 182938 DEBUG oslo_concurrency.lockutils [req-327c8081-ee69-4b5b-9e24-288b447509d7 req-d07065ef-5f2e-4fa4-aec6-1da5910c596e 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:47:46 np0005603500 nova_compute[182934]: 2026-01-31 06:47:46.520 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:47:46 np0005603500 nova_compute[182934]: 2026-01-31 06:47:46.520 182938 DEBUG nova.network.neutron [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:47:47 np0005603500 podman[218775]: 2026-01-31 06:47:47.136318591 +0000 UTC m=+0.050383932 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git)
Jan 31 01:47:47 np0005603500 podman[218774]: 2026-01-31 06:47:47.139313346 +0000 UTC m=+0.061128003 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:47:47 np0005603500 nova_compute[182934]: 2026-01-31 06:47:47.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:47:47 np0005603500 nova_compute[182934]: 2026-01-31 06:47:47.979 182938 DEBUG nova.network.neutron [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:47:48 np0005603500 nova_compute[182934]: 2026-01-31 06:47:48.898 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:49 np0005603500 nova_compute[182934]: 2026-01-31 06:47:49.151 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.055 182938 DEBUG nova.network.neutron [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Updating instance_info_cache with network_info: [{"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.565 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.566 182938 DEBUG nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Instance network_info: |[{"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.568 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Start _get_guest_xml network_info=[{"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.572 182938 WARNING nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.574 182938 DEBUG nova.virt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-261222791', uuid='19b9866a-ffdf-4074-b605-12988cf688fa'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device
_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769842071.574347) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.580 182938 DEBUG nova.virt.libvirt.host [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.580 182938 DEBUG nova.virt.libvirt.host [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.582 182938 DEBUG nova.virt.libvirt.host [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.583 182938 DEBUG nova.virt.libvirt.host [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.583 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.583 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.584 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.584 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.585 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.585 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.585 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.585 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.586 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.586 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.586 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.587 182938 DEBUG nova.virt.hardware [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.590 182938 DEBUG nova.virt.libvirt.vif [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:47:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-261222791',display_name='tempest-TestNetworkBasicOps-server-261222791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-261222791',id=12,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF5vqaP6nGopzb91aDDJXpx7Ly0Sxm9oF3ehyWfdSEp3B+jf7SAETxAGTmuToCpQrGXRtrRyn0kdZQy8LxdkDc0vxQzNswubsYs/PS1WzAwgpZyYXs3ZFT4RCs1XHbyqYw==',key_name='tempest-TestNetworkBasicOps-133039895',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-v3fcf8ki',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:47:41Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=19b9866a-ffdf-4074-b605-12988cf688fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.590 182938 DEBUG nova.network.os_vif_util [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.591 182938 DEBUG nova.network.os_vif_util [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:c9:98,bridge_name='br-int',has_traffic_filtering=True,id=93bc43c2-a00a-4c71-a3e9-b7b0306969c9,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93bc43c2-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:47:51 np0005603500 nova_compute[182934]: 2026-01-31 06:47:51.592 182938 DEBUG nova.objects.instance [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 19b9866a-ffdf-4074-b605-12988cf688fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.106 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <uuid>19b9866a-ffdf-4074-b605-12988cf688fa</uuid>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <name>instance-0000000c</name>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-261222791</nova:name>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:47:51</nova:creationTime>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:        <nova:port uuid="93bc43c2-a00a-4c71-a3e9-b7b0306969c9">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <entry name="serial">19b9866a-ffdf-4074-b605-12988cf688fa</entry>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <entry name="uuid">19b9866a-ffdf-4074-b605-12988cf688fa</entry>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk.config"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:b3:c9:98"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <target dev="tap93bc43c2-a0"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/console.log" append="off"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:47:52 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:47:52 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:47:52 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:47:52 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.108 182938 DEBUG nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Preparing to wait for external event network-vif-plugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.108 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.108 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.108 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.109 182938 DEBUG nova.virt.libvirt.vif [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:47:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-261222791',display_name='tempest-TestNetworkBasicOps-server-261222791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-261222791',id=12,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF5vqaP6nGopzb91aDDJXpx7Ly0Sxm9oF3ehyWfdSEp3B+jf7SAETxAGTmuToCpQrGXRtrRyn0kdZQy8LxdkDc0vxQzNswubsYs/PS1WzAwgpZyYXs3ZFT4RCs1XHbyqYw==',key_name='tempest-TestNetworkBasicOps-133039895',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-v3fcf8ki',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:47:41Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=19b9866a-ffdf-4074-b605-12988cf688fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.109 182938 DEBUG nova.network.os_vif_util [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.110 182938 DEBUG nova.network.os_vif_util [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:c9:98,bridge_name='br-int',has_traffic_filtering=True,id=93bc43c2-a00a-4c71-a3e9-b7b0306969c9,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93bc43c2-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.111 182938 DEBUG os_vif [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:c9:98,bridge_name='br-int',has_traffic_filtering=True,id=93bc43c2-a00a-4c71-a3e9-b7b0306969c9,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93bc43c2-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.111 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.111 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.112 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.113 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.113 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8c0aa471-7d94-5128-aa2e-21e8f2c70400', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.115 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.119 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.119 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93bc43c2-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.120 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap93bc43c2-a0, col_values=(('qos', UUID('f3993b33-da9a-47a2-ba7f-6f8d5bef1ed4')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.120 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap93bc43c2-a0, col_values=(('external_ids', {'iface-id': '93bc43c2-a00a-4c71-a3e9-b7b0306969c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:c9:98', 'vm-uuid': '19b9866a-ffdf-4074-b605-12988cf688fa'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.122 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:52 np0005603500 NetworkManager[55506]: <info>  [1769842072.1232] manager: (tap93bc43c2-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.125 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.128 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:52 np0005603500 nova_compute[182934]: 2026-01-31 06:47:52.129 182938 INFO os_vif [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:c9:98,bridge_name='br-int',has_traffic_filtering=True,id=93bc43c2-a00a-4c71-a3e9-b7b0306969c9,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93bc43c2-a0')
Jan 31 01:47:53 np0005603500 nova_compute[182934]: 2026-01-31 06:47:53.679 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:47:53 np0005603500 nova_compute[182934]: 2026-01-31 06:47:53.680 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:47:53 np0005603500 nova_compute[182934]: 2026-01-31 06:47:53.680 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:b3:c9:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:47:53 np0005603500 nova_compute[182934]: 2026-01-31 06:47:53.681 182938 INFO nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Using config drive
Jan 31 01:47:53 np0005603500 nova_compute[182934]: 2026-01-31 06:47:53.899 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.018 182938 INFO nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Creating config drive at /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk.config
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.023 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmprzldgfxe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:47:55 np0005603500 podman[218829]: 2026-01-31 06:47:55.14488109 +0000 UTC m=+0.053623204 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:47:55 np0005603500 podman[218828]: 2026-01-31 06:47:55.14489901 +0000 UTC m=+0.060577145 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.144 182938 DEBUG oslo_concurrency.processutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmprzldgfxe" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:47:55 np0005603500 kernel: tap93bc43c2-a0: entered promiscuous mode
Jan 31 01:47:55 np0005603500 NetworkManager[55506]: <info>  [1769842075.1885] manager: (tap93bc43c2-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.190 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:55 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:55Z|00173|binding|INFO|Claiming lport 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 for this chassis.
Jan 31 01:47:55 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:55Z|00174|binding|INFO|93bc43c2-a00a-4c71-a3e9-b7b0306969c9: Claiming fa:16:3e:b3:c9:98 10.100.0.6
Jan 31 01:47:55 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:55Z|00175|binding|INFO|Setting lport 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 ovn-installed in OVS
Jan 31 01:47:55 np0005603500 ovn_controller[95398]: 2026-01-31T06:47:55Z|00176|binding|INFO|Setting lport 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 up in Southbound
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.198 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:c9:98 10.100.0.6'], port_security=['fa:16:3e:b3:c9:98 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '19b9866a-ffdf-4074-b605-12988cf688fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1eee62de-b98b-4359-b27b-63fb0219f31c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '66391078-8c58-49e3-825a-98ebccd2d2d1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=055d2587-07d8-48d2-bdce-e8f3e4584c68, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=93bc43c2-a00a-4c71-a3e9-b7b0306969c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.200 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.200 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 in datapath 1eee62de-b98b-4359-b27b-63fb0219f31c bound to our chassis
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.205 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1eee62de-b98b-4359-b27b-63fb0219f31c
Jan 31 01:47:55 np0005603500 systemd-udevd[218888]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.217 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6f395d-ab47-4901-a477-c2a57e8ff31f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:55 np0005603500 systemd-machined[154375]: New machine qemu-12-instance-0000000c.
Jan 31 01:47:55 np0005603500 NetworkManager[55506]: <info>  [1769842075.2302] device (tap93bc43c2-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:47:55 np0005603500 NetworkManager[55506]: <info>  [1769842075.2316] device (tap93bc43c2-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:47:55 np0005603500 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.239 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[7fbbd857-925f-4d91-82a4-61e918b9a359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.244 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[280dbeab-444a-4104-9513-8f641e5303ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.269 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[07ded59a-1c9d-482b-9b3a-0608432583f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.284 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ba584a-0755-4148-b21f-d287f35bc9ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1eee62de-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:54:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434399, 'reachable_time': 44609, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218900, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.300 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6fdf8e73-e997-44fe-96b1-24f930f09d4f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1eee62de-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434407, 'tstamp': 434407}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218901, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1eee62de-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434409, 'tstamp': 434409}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218901, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.302 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1eee62de-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.304 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.305 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.306 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1eee62de-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.306 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.306 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1eee62de-b0, col_values=(('external_ids', {'iface-id': '060d8bd6-d243-4a7c-b3cb-d0f6dfd68585'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.306 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.308 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[084fb6f7-5983-42fd-8cf8-8d72844beec5]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-1eee62de-b98b-4359-b27b-63fb0219f31c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 1eee62de-b98b-4359-b27b-63fb0219f31c\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.493 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.493 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:55 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:55.494 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.565 182938 DEBUG nova.compute.manager [req-32caf8d7-1e3c-449c-85f4-384414d995ca req-35da2c89-864d-4e41-95ee-9ae22ba08737 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received event network-vif-plugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.565 182938 DEBUG oslo_concurrency.lockutils [req-32caf8d7-1e3c-449c-85f4-384414d995ca req-35da2c89-864d-4e41-95ee-9ae22ba08737 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.566 182938 DEBUG oslo_concurrency.lockutils [req-32caf8d7-1e3c-449c-85f4-384414d995ca req-35da2c89-864d-4e41-95ee-9ae22ba08737 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.566 182938 DEBUG oslo_concurrency.lockutils [req-32caf8d7-1e3c-449c-85f4-384414d995ca req-35da2c89-864d-4e41-95ee-9ae22ba08737 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:55 np0005603500 nova_compute[182934]: 2026-01-31 06:47:55.566 182938 DEBUG nova.compute.manager [req-32caf8d7-1e3c-449c-85f4-384414d995ca req-35da2c89-864d-4e41-95ee-9ae22ba08737 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Processing event network-vif-plugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:47:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:56.337 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:56.338 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:47:56.339 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:56 np0005603500 nova_compute[182934]: 2026-01-31 06:47:56.571 182938 DEBUG nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:47:56 np0005603500 nova_compute[182934]: 2026-01-31 06:47:56.576 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:47:56 np0005603500 nova_compute[182934]: 2026-01-31 06:47:56.580 182938 INFO nova.virt.libvirt.driver [-] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Instance spawned successfully.
Jan 31 01:47:56 np0005603500 nova_compute[182934]: 2026-01-31 06:47:56.580 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.095 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.096 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.096 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.097 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.098 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.098 182938 DEBUG nova.virt.libvirt.driver [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.124 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.613 182938 INFO nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Took 15.00 seconds to spawn the instance on the hypervisor.
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.615 182938 DEBUG nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.843 182938 DEBUG nova.compute.manager [req-f801653a-3f89-421d-b605-124439e4c45a req-e0da8a3a-eb1c-4217-8a6e-83e3b96959c6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received event network-vif-plugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.844 182938 DEBUG oslo_concurrency.lockutils [req-f801653a-3f89-421d-b605-124439e4c45a req-e0da8a3a-eb1c-4217-8a6e-83e3b96959c6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.844 182938 DEBUG oslo_concurrency.lockutils [req-f801653a-3f89-421d-b605-124439e4c45a req-e0da8a3a-eb1c-4217-8a6e-83e3b96959c6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.845 182938 DEBUG oslo_concurrency.lockutils [req-f801653a-3f89-421d-b605-124439e4c45a req-e0da8a3a-eb1c-4217-8a6e-83e3b96959c6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.845 182938 DEBUG nova.compute.manager [req-f801653a-3f89-421d-b605-124439e4c45a req-e0da8a3a-eb1c-4217-8a6e-83e3b96959c6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] No waiting events found dispatching network-vif-plugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:47:57 np0005603500 nova_compute[182934]: 2026-01-31 06:47:57.845 182938 WARNING nova.compute.manager [req-f801653a-3f89-421d-b605-124439e4c45a req-e0da8a3a-eb1c-4217-8a6e-83e3b96959c6 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received unexpected event network-vif-plugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 for instance with vm_state active and task_state None.
Jan 31 01:47:58 np0005603500 nova_compute[182934]: 2026-01-31 06:47:58.135 182938 INFO nova.compute.manager [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Took 20.20 seconds to build instance.
Jan 31 01:47:58 np0005603500 nova_compute[182934]: 2026-01-31 06:47:58.642 182938 DEBUG oslo_concurrency.lockutils [None req-1d467fef-7042-4302-8f9f-904f3f63f6fd dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:47:58 np0005603500 nova_compute[182934]: 2026-01-31 06:47:58.901 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:02 np0005603500 nova_compute[182934]: 2026-01-31 06:48:02.127 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:03 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:03.495 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:48:03 np0005603500 nova_compute[182934]: 2026-01-31 06:48:03.902 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:05 np0005603500 nova_compute[182934]: 2026-01-31 06:48:05.810 182938 DEBUG nova.compute.manager [req-58d1426c-45c5-4f76-8157-5bfda68eed8b req-85ca8fbe-594e-47bf-ada7-1869f47123c2 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received event network-changed-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:05 np0005603500 nova_compute[182934]: 2026-01-31 06:48:05.811 182938 DEBUG nova.compute.manager [req-58d1426c-45c5-4f76-8157-5bfda68eed8b req-85ca8fbe-594e-47bf-ada7-1869f47123c2 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Refreshing instance network info cache due to event network-changed-93bc43c2-a00a-4c71-a3e9-b7b0306969c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:48:05 np0005603500 nova_compute[182934]: 2026-01-31 06:48:05.812 182938 DEBUG oslo_concurrency.lockutils [req-58d1426c-45c5-4f76-8157-5bfda68eed8b req-85ca8fbe-594e-47bf-ada7-1869f47123c2 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:48:05 np0005603500 nova_compute[182934]: 2026-01-31 06:48:05.812 182938 DEBUG oslo_concurrency.lockutils [req-58d1426c-45c5-4f76-8157-5bfda68eed8b req-85ca8fbe-594e-47bf-ada7-1869f47123c2 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:48:05 np0005603500 nova_compute[182934]: 2026-01-31 06:48:05.812 182938 DEBUG nova.network.neutron [req-58d1426c-45c5-4f76-8157-5bfda68eed8b req-85ca8fbe-594e-47bf-ada7-1869f47123c2 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Refreshing network info cache for port 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:48:07 np0005603500 nova_compute[182934]: 2026-01-31 06:48:07.132 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:08 np0005603500 nova_compute[182934]: 2026-01-31 06:48:08.905 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:09 np0005603500 podman[218926]: 2026-01-31 06:48:09.140338471 +0000 UTC m=+0.051453195 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:48:09 np0005603500 podman[218925]: 2026-01-31 06:48:09.14185017 +0000 UTC m=+0.054802863 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:48:09 np0005603500 ovn_controller[95398]: 2026-01-31T06:48:09Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b3:c9:98 10.100.0.6
Jan 31 01:48:09 np0005603500 ovn_controller[95398]: 2026-01-31T06:48:09Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b3:c9:98 10.100.0.6
Jan 31 01:48:11 np0005603500 nova_compute[182934]: 2026-01-31 06:48:11.068 182938 DEBUG nova.network.neutron [req-58d1426c-45c5-4f76-8157-5bfda68eed8b req-85ca8fbe-594e-47bf-ada7-1869f47123c2 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Updated VIF entry in instance network info cache for port 93bc43c2-a00a-4c71-a3e9-b7b0306969c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:48:11 np0005603500 nova_compute[182934]: 2026-01-31 06:48:11.068 182938 DEBUG nova.network.neutron [req-58d1426c-45c5-4f76-8157-5bfda68eed8b req-85ca8fbe-594e-47bf-ada7-1869f47123c2 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Updating instance_info_cache with network_info: [{"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:48:11 np0005603500 nova_compute[182934]: 2026-01-31 06:48:11.581 182938 DEBUG oslo_concurrency.lockutils [req-58d1426c-45c5-4f76-8157-5bfda68eed8b req-85ca8fbe-594e-47bf-ada7-1869f47123c2 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:48:12 np0005603500 nova_compute[182934]: 2026-01-31 06:48:12.136 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:13 np0005603500 nova_compute[182934]: 2026-01-31 06:48:13.907 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:17 np0005603500 nova_compute[182934]: 2026-01-31 06:48:17.137 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:17.990 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/5b239873-eca6-4fdc-b15d-2801c75cafa9 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}9de33c3c4c813c7413c734743528a34030291a616c281269e5092e293b0fad44" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Jan 31 01:48:18 np0005603500 podman[218964]: 2026-01-31 06:48:18.129433702 +0000 UTC m=+0.048498351 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.7, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1769056855, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 01:48:18 np0005603500 podman[218963]: 2026-01-31 06:48:18.141303279 +0000 UTC m=+0.062701483 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 01:48:18 np0005603500 nova_compute[182934]: 2026-01-31 06:48:18.909 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:19.062 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1973 Content-Type: application/json Date: Sat, 31 Jan 2026 06:48:18 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-dd8c117b-8b67-4a36-8597-425b4ccd91d4 x-openstack-request-id: req-dd8c117b-8b67-4a36-8597-425b4ccd91d4 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Jan 31 01:48:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:19.062 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "5b239873-eca6-4fdc-b15d-2801c75cafa9", "name": "tempest-TestNetworkBasicOps-server-794782075", "status": "ACTIVE", "tenant_id": "829310cd8381494e96216dba067ff8d3", "user_id": "dddc34b0385a49a5bd9bf081ed29e9fd", "metadata": {}, "hostId": "0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e", "image": {"id": "9f613975-b701-42a0-9b35-7d5c4a2cb7f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/9f613975-b701-42a0-9b35-7d5c4a2cb7f2"}]}, "flavor": {"id": "9956992e-a3ca-497f-9747-3ae270e07def", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9956992e-a3ca-497f-9747-3ae270e07def"}]}, "created": "2026-01-31T06:46:50Z", "updated": "2026-01-31T06:47:14Z", "addresses": {"tempest-network-smoke--1900567584": [{"version": 4, "addr": "10.100.0.7", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:a7:e9:1e"}, {"version": 4, "addr": "192.168.122.234", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:a7:e9:1e"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/5b239873-eca6-4fdc-b15d-2801c75cafa9"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/5b239873-eca6-4fdc-b15d-2801c75cafa9"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-248708636", "OS-SRV-USG:launched_at": "2026-01-31T06:47:14.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-132872882"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-0000000b", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, 
"OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Jan 31 01:48:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:19.063 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/5b239873-eca6-4fdc-b15d-2801c75cafa9 used request id req-dd8c117b-8b67-4a36-8597-425b4ccd91d4 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Jan 31 01:48:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:19.063 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5b239873-eca6-4fdc-b15d-2801c75cafa9', 'name': 'tempest-TestNetworkBasicOps-server-794782075', 'flavor': {'id': '9956992e-a3ca-497f-9747-3ae270e07def', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '829310cd8381494e96216dba067ff8d3', 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'hostId': '0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Jan 31 01:48:19 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:19.066 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/19b9866a-ffdf-4074-b605-12988cf688fa -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}9de33c3c4c813c7413c734743528a34030291a616c281269e5092e293b0fad44" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.070 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1973 Content-Type: application/json Date: Sat, 31 Jan 2026 06:48:19 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-c09035b8-6946-429c-99a3-86caf8c13d0e x-openstack-request-id: req-c09035b8-6946-429c-99a3-86caf8c13d0e _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.070 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "19b9866a-ffdf-4074-b605-12988cf688fa", "name": "tempest-TestNetworkBasicOps-server-261222791", "status": "ACTIVE", "tenant_id": "829310cd8381494e96216dba067ff8d3", "user_id": "dddc34b0385a49a5bd9bf081ed29e9fd", "metadata": {}, "hostId": "0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e", "image": {"id": "9f613975-b701-42a0-9b35-7d5c4a2cb7f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/9f613975-b701-42a0-9b35-7d5c4a2cb7f2"}]}, "flavor": {"id": "9956992e-a3ca-497f-9747-3ae270e07def", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9956992e-a3ca-497f-9747-3ae270e07def"}]}, "created": "2026-01-31T06:47:34Z", "updated": "2026-01-31T06:47:57Z", "addresses": {"tempest-network-smoke--1900567584": [{"version": 4, "addr": "10.100.0.6", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:b3:c9:98"}, {"version": 4, "addr": "192.168.122.197", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:b3:c9:98"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/19b9866a-ffdf-4074-b605-12988cf688fa"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/19b9866a-ffdf-4074-b605-12988cf688fa"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-133039895", "OS-SRV-USG:launched_at": "2026-01-31T06:47:57.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-615024026"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-0000000c", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, 
"OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.070 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/19b9866a-ffdf-4074-b605-12988cf688fa used request id req-c09035b8-6946-429c-99a3-86caf8c13d0e request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.071 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '19b9866a-ffdf-4074-b605-12988cf688fa', 'name': 'tempest-TestNetworkBasicOps-server-261222791', 'flavor': {'id': '9956992e-a3ca-497f-9747-3ae270e07def', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '829310cd8381494e96216dba067ff8d3', 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'hostId': '0c6cfdf0627941602de15b61ce73eab761f53a1d9b2a5d92c8bbcc8e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.072 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.072 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.072 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d7f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.072 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.073 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2026-01-31T06:48:21.072786) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.076 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5b239873-eca6-4fdc-b15d-2801c75cafa9 / tap3c1979fb-e9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.076 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.080 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 19b9866a-ffdf-4074-b605-12988cf688fa / tap93bc43c2-a0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.080 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.outgoing.packets volume: 17 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.081 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.081 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.081 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.081 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.081 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.081 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.082 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2026-01-31T06:48:21.081920) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.093 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/memory.usage volume: 42.6875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.105 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/memory.usage volume: 42.43359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.106 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.106 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.106 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.107 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.107 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b610>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.107 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.107 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2026-01-31T06:48:21.107206) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.131 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.read.bytes volume: 31001088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.132 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.151 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.read.bytes volume: 30108160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.152 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.153 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.153 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.153 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.153 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.153 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43be80>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.154 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.154 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2026-01-31T06:48:21.154061) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.155 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.write.latency volume: 2600830486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.155 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.156 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.write.latency volume: 2612034904 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.156 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.156 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.156 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.157 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.157 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.157 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d850>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.157 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2026-01-31T06:48:21.157442) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.157 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.157 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.158 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-794782075>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-261222791>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-794782075>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-261222791>]
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.158 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.158 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.158 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.158 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.158 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.159 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2026-01-31T06:48:21.158921) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.159 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.read.requests volume: 1131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.159 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.159 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.read.requests volume: 1083 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.160 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.160 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.160 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.160 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.161 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.161 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b9d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.161 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.161 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2026-01-31T06:48:21.161312) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.169 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.170 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.180 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.181 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.182 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.182 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.182 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.182 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.183 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dca0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.183 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.183 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2026-01-31T06:48:21.183152) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.184 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.184 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-794782075>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-261222791>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-794782075>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-261222791>]
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.184 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.184 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.184 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.184 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.184 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.185 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2026-01-31T06:48:21.184846) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.185 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.incoming.bytes volume: 1520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.185 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.incoming.bytes volume: 1758 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.186 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.186 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.186 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.186 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.186 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d1f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.186 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2026-01-31T06:48:21.186689) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.186 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.187 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.187 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.187 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.187 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.187 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.187 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.188 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dbe0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.188 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2026-01-31T06:48:21.188146) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.188 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.188 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.188 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.189 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.189 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.189 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.189 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.189 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d130>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.189 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.189 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.189 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2026-01-31T06:48:21.189792) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.190 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.outgoing.bytes volume: 1666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.190 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.190 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.190 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.190 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.191 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f4512e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.191 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.191 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.191 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2026-01-31T06:48:21.191160) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.191 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.192 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.192 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.192 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.192 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.193 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44dd60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.193 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.193 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.193 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2026-01-31T06:48:21.193158) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.193 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.incoming.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.194 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.194 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.194 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.194 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.194 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b6d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.194 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.194 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2026-01-31T06:48:21.194747) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.194 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.read.latency volume: 870138752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.195 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.read.latency volume: 70238392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.195 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.read.latency volume: 802832624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.195 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.read.latency volume: 63025679 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.196 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.196 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.196 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.196 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.196 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f436ee0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.196 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.197 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/cpu volume: 11320000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.197 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2026-01-31T06:48:21.196905) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.197 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/cpu volume: 11190000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.197 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.197 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.197 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.198 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.198 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d0a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.198 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.198 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2026-01-31T06:48:21.198241) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.198 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.198 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.199 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.199 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.199 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.199 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.199 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43bbb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.199 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.199 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2026-01-31T06:48:21.199663) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.200 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.200 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.200 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.200 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.200 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b2e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.200 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.201 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2026-01-31T06:48:21.200880) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.201 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.201 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.201 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.201 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.201 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b5e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.202 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.202 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2026-01-31T06:48:21.202047) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.202 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.write.bytes volume: 73060352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.202 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.202 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.write.bytes volume: 72953856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.203 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.203 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.203 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.203 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.203 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.203 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.203 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.204 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2026-01-31T06:48:21.203930) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.204 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.204 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.204 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.204 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.205 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.205 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.205 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.205 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.205 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d3a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.205 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.205 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2026-01-31T06:48:21.205786) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.205 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.206 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.206 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.206 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.206 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.206 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.207 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44de20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.207 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.207 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2026-01-31T06:48:21.207107) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.207 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.207 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.208 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.208 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.208 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.208 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.208 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43ba60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.208 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.208 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2026-01-31T06:48:21.208489) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.208 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.209 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.209 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.209 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.209 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.210 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.210 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.210 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.210 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f44d580>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.210 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.210 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2026-01-31T06:48:21.210438) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.210 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.210 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.211 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.211 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.211 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.211 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.211 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f199f43b400>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.211 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.212 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2026-01-31T06:48:21.211915) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.212 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.write.requests volume: 334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.212 16 DEBUG ceilometer.compute.pollsters [-] 5b239873-eca6-4fdc-b15d-2801c75cafa9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.212 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.write.requests volume: 313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.212 16 DEBUG ceilometer.compute.pollsters [-] 19b9866a-ffdf-4074-b605-12988cf688fa/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 31 01:48:21 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:48:21.213 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Jan 31 01:48:22 np0005603500 nova_compute[182934]: 2026-01-31 06:48:22.140 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:22 np0005603500 nova_compute[182934]: 2026-01-31 06:48:22.372 182938 INFO nova.compute.manager [None req-07c750c8-5c39-499b-9ff7-a918ece044b8 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Get console output
Jan 31 01:48:22 np0005603500 nova_compute[182934]: 2026-01-31 06:48:22.376 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:48:23 np0005603500 nova_compute[182934]: 2026-01-31 06:48:23.830 182938 DEBUG nova.compute.manager [req-874f8a1b-6e7f-4100-a1df-d06886777078 req-d4d95361-dd4a-4743-8a13-33801993613d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:23 np0005603500 nova_compute[182934]: 2026-01-31 06:48:23.831 182938 DEBUG nova.compute.manager [req-874f8a1b-6e7f-4100-a1df-d06886777078 req-d4d95361-dd4a-4743-8a13-33801993613d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing instance network info cache due to event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:48:23 np0005603500 nova_compute[182934]: 2026-01-31 06:48:23.831 182938 DEBUG oslo_concurrency.lockutils [req-874f8a1b-6e7f-4100-a1df-d06886777078 req-d4d95361-dd4a-4743-8a13-33801993613d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:48:23 np0005603500 nova_compute[182934]: 2026-01-31 06:48:23.831 182938 DEBUG oslo_concurrency.lockutils [req-874f8a1b-6e7f-4100-a1df-d06886777078 req-d4d95361-dd4a-4743-8a13-33801993613d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:48:23 np0005603500 nova_compute[182934]: 2026-01-31 06:48:23.831 182938 DEBUG nova.network.neutron [req-874f8a1b-6e7f-4100-a1df-d06886777078 req-d4d95361-dd4a-4743-8a13-33801993613d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing network info cache for port 3c1979fb-e961-4676-b6c2-2ca71f2d859b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:48:23 np0005603500 nova_compute[182934]: 2026-01-31 06:48:23.910 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:24 np0005603500 nova_compute[182934]: 2026-01-31 06:48:24.843 182938 INFO nova.compute.manager [None req-ea548bd8-a689-425d-9087-5b71a3c33d54 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Get console output
Jan 31 01:48:24 np0005603500 nova_compute[182934]: 2026-01-31 06:48:24.848 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:48:25 np0005603500 podman[219013]: 2026-01-31 06:48:25.381292843 +0000 UTC m=+0.042810122 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:48:25 np0005603500 podman[219012]: 2026-01-31 06:48:25.41267189 +0000 UTC m=+0.075222711 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.039 182938 DEBUG nova.compute.manager [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-unplugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.040 182938 DEBUG oslo_concurrency.lockutils [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.040 182938 DEBUG oslo_concurrency.lockutils [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.040 182938 DEBUG oslo_concurrency.lockutils [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.040 182938 DEBUG nova.compute.manager [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] No waiting events found dispatching network-vif-unplugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.040 182938 WARNING nova.compute.manager [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received unexpected event network-vif-unplugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b for instance with vm_state active and task_state None.
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.041 182938 DEBUG nova.compute.manager [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.041 182938 DEBUG oslo_concurrency.lockutils [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.041 182938 DEBUG oslo_concurrency.lockutils [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.041 182938 DEBUG oslo_concurrency.lockutils [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.041 182938 DEBUG nova.compute.manager [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] No waiting events found dispatching network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:48:26 np0005603500 nova_compute[182934]: 2026-01-31 06:48:26.041 182938 WARNING nova.compute.manager [req-8dec4e65-959d-4a71-8842-0676771ffcd1 req-5f70baf4-3e2f-498d-8765-e749adb80b47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received unexpected event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b for instance with vm_state active and task_state None.
Jan 31 01:48:27 np0005603500 nova_compute[182934]: 2026-01-31 06:48:27.144 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:27 np0005603500 nova_compute[182934]: 2026-01-31 06:48:27.393 182938 DEBUG nova.compute.manager [req-24ad7532-fcec-42eb-a2ae-a5cb9a9d5ae9 req-2afd86a1-d225-4b9b-b2a7-be8706fe88f4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:27 np0005603500 nova_compute[182934]: 2026-01-31 06:48:27.394 182938 DEBUG nova.compute.manager [req-24ad7532-fcec-42eb-a2ae-a5cb9a9d5ae9 req-2afd86a1-d225-4b9b-b2a7-be8706fe88f4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing instance network info cache due to event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:48:27 np0005603500 nova_compute[182934]: 2026-01-31 06:48:27.394 182938 DEBUG oslo_concurrency.lockutils [req-24ad7532-fcec-42eb-a2ae-a5cb9a9d5ae9 req-2afd86a1-d225-4b9b-b2a7-be8706fe88f4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:48:27 np0005603500 nova_compute[182934]: 2026-01-31 06:48:27.787 182938 INFO nova.compute.manager [None req-f83cfbb5-4a3e-461e-b1d8-a37e8301c98d dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Get console output
Jan 31 01:48:27 np0005603500 nova_compute[182934]: 2026-01-31 06:48:27.792 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:48:27 np0005603500 nova_compute[182934]: 2026-01-31 06:48:27.805 182938 DEBUG nova.network.neutron [req-874f8a1b-6e7f-4100-a1df-d06886777078 req-d4d95361-dd4a-4743-8a13-33801993613d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updated VIF entry in instance network info cache for port 3c1979fb-e961-4676-b6c2-2ca71f2d859b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:48:27 np0005603500 nova_compute[182934]: 2026-01-31 06:48:27.806 182938 DEBUG nova.network.neutron [req-874f8a1b-6e7f-4100-a1df-d06886777078 req-d4d95361-dd4a-4743-8a13-33801993613d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updating instance_info_cache with network_info: [{"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.265 182938 DEBUG nova.compute.manager [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.265 182938 DEBUG oslo_concurrency.lockutils [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.266 182938 DEBUG oslo_concurrency.lockutils [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.266 182938 DEBUG oslo_concurrency.lockutils [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.266 182938 DEBUG nova.compute.manager [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] No waiting events found dispatching network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.266 182938 WARNING nova.compute.manager [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received unexpected event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b for instance with vm_state active and task_state None.
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.266 182938 DEBUG nova.compute.manager [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.267 182938 DEBUG oslo_concurrency.lockutils [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.267 182938 DEBUG oslo_concurrency.lockutils [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.267 182938 DEBUG oslo_concurrency.lockutils [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.267 182938 DEBUG nova.compute.manager [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] No waiting events found dispatching network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.267 182938 WARNING nova.compute.manager [req-8dc0280a-0636-4853-8e5c-c290bb9e8fff req-835a9db8-fc40-4f77-9600-009107a2d941 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received unexpected event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b for instance with vm_state active and task_state None.
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.310 182938 DEBUG oslo_concurrency.lockutils [req-874f8a1b-6e7f-4100-a1df-d06886777078 req-d4d95361-dd4a-4743-8a13-33801993613d 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.311 182938 DEBUG oslo_concurrency.lockutils [req-24ad7532-fcec-42eb-a2ae-a5cb9a9d5ae9 req-2afd86a1-d225-4b9b-b2a7-be8706fe88f4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.311 182938 DEBUG nova.network.neutron [req-24ad7532-fcec-42eb-a2ae-a5cb9a9d5ae9 req-2afd86a1-d225-4b9b-b2a7-be8706fe88f4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing network info cache for port 3c1979fb-e961-4676-b6c2-2ca71f2d859b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:48:28 np0005603500 nova_compute[182934]: 2026-01-31 06:48:28.916 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.344 182938 DEBUG oslo_concurrency.lockutils [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "19b9866a-ffdf-4074-b605-12988cf688fa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.345 182938 DEBUG oslo_concurrency.lockutils [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.345 182938 DEBUG oslo_concurrency.lockutils [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.346 182938 DEBUG oslo_concurrency.lockutils [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.346 182938 DEBUG oslo_concurrency.lockutils [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.348 182938 INFO nova.compute.manager [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Terminating instance
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.465 182938 DEBUG nova.compute.manager [req-43fba860-7914-4372-8a0f-8a5f270a4d0b req-749df8fd-d900-4939-99e5-6e33b2e3dc87 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received event network-changed-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.466 182938 DEBUG nova.compute.manager [req-43fba860-7914-4372-8a0f-8a5f270a4d0b req-749df8fd-d900-4939-99e5-6e33b2e3dc87 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Refreshing instance network info cache due to event network-changed-93bc43c2-a00a-4c71-a3e9-b7b0306969c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.467 182938 DEBUG oslo_concurrency.lockutils [req-43fba860-7914-4372-8a0f-8a5f270a4d0b req-749df8fd-d900-4939-99e5-6e33b2e3dc87 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.467 182938 DEBUG oslo_concurrency.lockutils [req-43fba860-7914-4372-8a0f-8a5f270a4d0b req-749df8fd-d900-4939-99e5-6e33b2e3dc87 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.468 182938 DEBUG nova.network.neutron [req-43fba860-7914-4372-8a0f-8a5f270a4d0b req-749df8fd-d900-4939-99e5-6e33b2e3dc87 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Refreshing network info cache for port 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.859 182938 DEBUG nova.compute.manager [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:48:30 np0005603500 kernel: tap93bc43c2-a0 (unregistering): left promiscuous mode
Jan 31 01:48:30 np0005603500 NetworkManager[55506]: <info>  [1769842110.8860] device (tap93bc43c2-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:48:30 np0005603500 ovn_controller[95398]: 2026-01-31T06:48:30Z|00177|binding|INFO|Releasing lport 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 from this chassis (sb_readonly=0)
Jan 31 01:48:30 np0005603500 ovn_controller[95398]: 2026-01-31T06:48:30Z|00178|binding|INFO|Setting lport 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 down in Southbound
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.892 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:30 np0005603500 ovn_controller[95398]: 2026-01-31T06:48:30Z|00179|binding|INFO|Removing iface tap93bc43c2-a0 ovn-installed in OVS
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.893 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.901 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.903 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:c9:98 10.100.0.6'], port_security=['fa:16:3e:b3:c9:98 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '19b9866a-ffdf-4074-b605-12988cf688fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1eee62de-b98b-4359-b27b-63fb0219f31c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '66391078-8c58-49e3-825a-98ebccd2d2d1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=055d2587-07d8-48d2-bdce-e8f3e4584c68, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=93bc43c2-a00a-4c71-a3e9-b7b0306969c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.904 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 93bc43c2-a00a-4c71-a3e9-b7b0306969c9 in datapath 1eee62de-b98b-4359-b27b-63fb0219f31c unbound from our chassis
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.906 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1eee62de-b98b-4359-b27b-63fb0219f31c
Jan 31 01:48:30 np0005603500 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 31 01:48:30 np0005603500 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 14.032s CPU time.
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.922 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[e7fe9161-4d88-4f3e-b0eb-e5bca533d12a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:30 np0005603500 systemd-machined[154375]: Machine qemu-12-instance-0000000c terminated.
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.939 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[5be7e963-9c0a-4efc-9aaf-56cf74d59abc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.941 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[29863f6f-db98-4c16-8bbc-9db1cede3e93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.958 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[2897ea64-8aa5-4a58-814a-137ba44d7f40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.971 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[69b1c2d9-60b9-4f07-b7cd-66e4891bb164]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1eee62de-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:54:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434399, 'reachable_time': 44609, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219066, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.989 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[616e99c3-4786-49c0-b236-ca2ec007114c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1eee62de-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434407, 'tstamp': 434407}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219067, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1eee62de-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434409, 'tstamp': 434409}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219067, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.990 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1eee62de-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.991 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:30 np0005603500 nova_compute[182934]: 2026-01-31 06:48:30.995 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.995 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1eee62de-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.995 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.996 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1eee62de-b0, col_values=(('external_ids', {'iface-id': '060d8bd6-d243-4a7c-b3cb-d0f6dfd68585'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.996 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:48:30 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:30.997 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[54a112a3-734f-4575-b492-e1109a52c5b3]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-1eee62de-b98b-4359-b27b-63fb0219f31c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 1eee62de-b98b-4359-b27b-63fb0219f31c\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.100 182938 INFO nova.virt.libvirt.driver [-] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Instance destroyed successfully.
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.101 182938 DEBUG nova.objects.instance [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 19b9866a-ffdf-4074-b605-12988cf688fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.384 182938 DEBUG nova.compute.manager [req-e71e93ec-abc4-4e9d-a1ee-5da7d9a22da9 req-6c6118ef-508d-4850-9823-b875c5e1dd76 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received event network-vif-unplugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.385 182938 DEBUG oslo_concurrency.lockutils [req-e71e93ec-abc4-4e9d-a1ee-5da7d9a22da9 req-6c6118ef-508d-4850-9823-b875c5e1dd76 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.385 182938 DEBUG oslo_concurrency.lockutils [req-e71e93ec-abc4-4e9d-a1ee-5da7d9a22da9 req-6c6118ef-508d-4850-9823-b875c5e1dd76 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.385 182938 DEBUG oslo_concurrency.lockutils [req-e71e93ec-abc4-4e9d-a1ee-5da7d9a22da9 req-6c6118ef-508d-4850-9823-b875c5e1dd76 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.385 182938 DEBUG nova.compute.manager [req-e71e93ec-abc4-4e9d-a1ee-5da7d9a22da9 req-6c6118ef-508d-4850-9823-b875c5e1dd76 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] No waiting events found dispatching network-vif-unplugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.386 182938 DEBUG nova.compute.manager [req-e71e93ec-abc4-4e9d-a1ee-5da7d9a22da9 req-6c6118ef-508d-4850-9823-b875c5e1dd76 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received event network-vif-unplugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.608 182938 DEBUG nova.virt.libvirt.vif [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:47:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-261222791',display_name='tempest-TestNetworkBasicOps-server-261222791',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-261222791',id=12,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF5vqaP6nGopzb91aDDJXpx7Ly0Sxm9oF3ehyWfdSEp3B+jf7SAETxAGTmuToCpQrGXRtrRyn0kdZQy8LxdkDc0vxQzNswubsYs/PS1WzAwgpZyYXs3ZFT4RCs1XHbyqYw==',key_name='tempest-TestNetworkBasicOps-133039895',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:47:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-v3fcf8ki',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:47:57Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=19b9866a-ffdf-4074-b605-12988cf688fa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.610 182938 DEBUG nova.network.os_vif_util [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.611 182938 DEBUG nova.network.os_vif_util [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:c9:98,bridge_name='br-int',has_traffic_filtering=True,id=93bc43c2-a00a-4c71-a3e9-b7b0306969c9,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93bc43c2-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.612 182938 DEBUG os_vif [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:c9:98,bridge_name='br-int',has_traffic_filtering=True,id=93bc43c2-a00a-4c71-a3e9-b7b0306969c9,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93bc43c2-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.615 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.616 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93bc43c2-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.617 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.618 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.619 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.619 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=f3993b33-da9a-47a2-ba7f-6f8d5bef1ed4) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.620 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.621 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.623 182938 INFO os_vif [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:c9:98,bridge_name='br-int',has_traffic_filtering=True,id=93bc43c2-a00a-4c71-a3e9-b7b0306969c9,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93bc43c2-a0')
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.623 182938 INFO nova.virt.libvirt.driver [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Deleting instance files /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa_del
Jan 31 01:48:31 np0005603500 nova_compute[182934]: 2026-01-31 06:48:31.624 182938 INFO nova.virt.libvirt.driver [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Deletion of /var/lib/nova/instances/19b9866a-ffdf-4074-b605-12988cf688fa_del complete
Jan 31 01:48:32 np0005603500 nova_compute[182934]: 2026-01-31 06:48:32.138 182938 INFO nova.compute.manager [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Took 1.28 seconds to destroy the instance on the hypervisor.
Jan 31 01:48:32 np0005603500 nova_compute[182934]: 2026-01-31 06:48:32.139 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:48:32 np0005603500 nova_compute[182934]: 2026-01-31 06:48:32.140 182938 DEBUG nova.compute.manager [-] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:48:32 np0005603500 nova_compute[182934]: 2026-01-31 06:48:32.140 182938 DEBUG nova.network.neutron [-] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.025 182938 DEBUG nova.network.neutron [req-24ad7532-fcec-42eb-a2ae-a5cb9a9d5ae9 req-2afd86a1-d225-4b9b-b2a7-be8706fe88f4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updated VIF entry in instance network info cache for port 3c1979fb-e961-4676-b6c2-2ca71f2d859b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.025 182938 DEBUG nova.network.neutron [req-24ad7532-fcec-42eb-a2ae-a5cb9a9d5ae9 req-2afd86a1-d225-4b9b-b2a7-be8706fe88f4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updating instance_info_cache with network_info: [{"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.533 182938 DEBUG oslo_concurrency.lockutils [req-24ad7532-fcec-42eb-a2ae-a5cb9a9d5ae9 req-2afd86a1-d225-4b9b-b2a7-be8706fe88f4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.611 182938 DEBUG nova.compute.manager [req-617b6dbf-32a0-4d16-9afb-535cf2ca14a1 req-dfeeb132-3fbb-47c0-93dc-f9a476c3a171 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received event network-vif-plugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.611 182938 DEBUG oslo_concurrency.lockutils [req-617b6dbf-32a0-4d16-9afb-535cf2ca14a1 req-dfeeb132-3fbb-47c0-93dc-f9a476c3a171 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.611 182938 DEBUG oslo_concurrency.lockutils [req-617b6dbf-32a0-4d16-9afb-535cf2ca14a1 req-dfeeb132-3fbb-47c0-93dc-f9a476c3a171 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.612 182938 DEBUG oslo_concurrency.lockutils [req-617b6dbf-32a0-4d16-9afb-535cf2ca14a1 req-dfeeb132-3fbb-47c0-93dc-f9a476c3a171 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.612 182938 DEBUG nova.compute.manager [req-617b6dbf-32a0-4d16-9afb-535cf2ca14a1 req-dfeeb132-3fbb-47c0-93dc-f9a476c3a171 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] No waiting events found dispatching network-vif-plugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.612 182938 WARNING nova.compute.manager [req-617b6dbf-32a0-4d16-9afb-535cf2ca14a1 req-dfeeb132-3fbb-47c0-93dc-f9a476c3a171 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received unexpected event network-vif-plugged-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 for instance with vm_state active and task_state deleting.
Jan 31 01:48:33 np0005603500 nova_compute[182934]: 2026-01-31 06:48:33.918 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:34 np0005603500 nova_compute[182934]: 2026-01-31 06:48:34.289 182938 DEBUG nova.network.neutron [-] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:48:34 np0005603500 nova_compute[182934]: 2026-01-31 06:48:34.834 182938 INFO nova.compute.manager [-] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Took 2.69 seconds to deallocate network for instance.
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.063 182938 DEBUG nova.network.neutron [req-43fba860-7914-4372-8a0f-8a5f270a4d0b req-749df8fd-d900-4939-99e5-6e33b2e3dc87 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Updated VIF entry in instance network info cache for port 93bc43c2-a00a-4c71-a3e9-b7b0306969c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.063 182938 DEBUG nova.network.neutron [req-43fba860-7914-4372-8a0f-8a5f270a4d0b req-749df8fd-d900-4939-99e5-6e33b2e3dc87 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Updating instance_info_cache with network_info: [{"id": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "address": "fa:16:3e:b3:c9:98", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93bc43c2-a0", "ovs_interfaceid": "93bc43c2-a00a-4c71-a3e9-b7b0306969c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.347 182938 DEBUG oslo_concurrency.lockutils [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.347 182938 DEBUG oslo_concurrency.lockutils [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.409 182938 DEBUG nova.compute.provider_tree [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.569 182938 DEBUG oslo_concurrency.lockutils [req-43fba860-7914-4372-8a0f-8a5f270a4d0b req-749df8fd-d900-4939-99e5-6e33b2e3dc87 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-19b9866a-ffdf-4074-b605-12988cf688fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.886 182938 DEBUG nova.compute.manager [req-e705d381-9866-4989-ae44-3f1efa013fc9 req-4d81bcf1-9602-4d97-bb07-edd6067faae4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Received event network-vif-deleted-93bc43c2-a00a-4c71-a3e9-b7b0306969c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.887 182938 INFO nova.compute.manager [req-e705d381-9866-4989-ae44-3f1efa013fc9 req-4d81bcf1-9602-4d97-bb07-edd6067faae4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Neutron deleted interface 93bc43c2-a00a-4c71-a3e9-b7b0306969c9; detaching it from the instance and deleting it from the info cache
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.887 182938 DEBUG nova.network.neutron [req-e705d381-9866-4989-ae44-3f1efa013fc9 req-4d81bcf1-9602-4d97-bb07-edd6067faae4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:48:35 np0005603500 nova_compute[182934]: 2026-01-31 06:48:35.916 182938 DEBUG nova.scheduler.client.report [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:48:36 np0005603500 nova_compute[182934]: 2026-01-31 06:48:36.396 182938 DEBUG nova.compute.manager [req-e705d381-9866-4989-ae44-3f1efa013fc9 req-4d81bcf1-9602-4d97-bb07-edd6067faae4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 19b9866a-ffdf-4074-b605-12988cf688fa] Detach interface failed, port_id=93bc43c2-a00a-4c71-a3e9-b7b0306969c9, reason: Instance 19b9866a-ffdf-4074-b605-12988cf688fa could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Jan 31 01:48:36 np0005603500 nova_compute[182934]: 2026-01-31 06:48:36.428 182938 DEBUG oslo_concurrency.lockutils [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:36 np0005603500 nova_compute[182934]: 2026-01-31 06:48:36.450 182938 INFO nova.scheduler.client.report [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 19b9866a-ffdf-4074-b605-12988cf688fa
Jan 31 01:48:36 np0005603500 nova_compute[182934]: 2026-01-31 06:48:36.621 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:37 np0005603500 nova_compute[182934]: 2026-01-31 06:48:37.951 182938 DEBUG oslo_concurrency.lockutils [None req-e236703b-81b0-461d-b128-7dc3c7e5c688 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "19b9866a-ffdf-4074-b605-12988cf688fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:38 np0005603500 nova_compute[182934]: 2026-01-31 06:48:38.947 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:40 np0005603500 podman[219087]: 2026-01-31 06:48:40.140732353 +0000 UTC m=+0.051779326 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:48:40 np0005603500 nova_compute[182934]: 2026-01-31 06:48:40.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:48:40 np0005603500 podman[219088]: 2026-01-31 06:48:40.166961967 +0000 UTC m=+0.079506197 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 01:48:40 np0005603500 nova_compute[182934]: 2026-01-31 06:48:40.779 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:40 np0005603500 nova_compute[182934]: 2026-01-31 06:48:40.779 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:40 np0005603500 nova_compute[182934]: 2026-01-31 06:48:40.779 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:40 np0005603500 nova_compute[182934]: 2026-01-31 06:48:40.779 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:48:41 np0005603500 nova_compute[182934]: 2026-01-31 06:48:41.623 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:41 np0005603500 nova_compute[182934]: 2026-01-31 06:48:41.815 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:48:41 np0005603500 nova_compute[182934]: 2026-01-31 06:48:41.889 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:48:41 np0005603500 nova_compute[182934]: 2026-01-31 06:48:41.890 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:48:41 np0005603500 nova_compute[182934]: 2026-01-31 06:48:41.940 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.059 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.060 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5629MB free_disk=73.18307113647461GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.061 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.061 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.407 182938 DEBUG nova.compute.manager [req-74d72cf8-84a5-4aee-adb4-28045b1a0ac0 req-e167ba32-5f42-482a-a978-bd580d1b6501 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.408 182938 DEBUG nova.compute.manager [req-74d72cf8-84a5-4aee-adb4-28045b1a0ac0 req-e167ba32-5f42-482a-a978-bd580d1b6501 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing instance network info cache due to event network-changed-3c1979fb-e961-4676-b6c2-2ca71f2d859b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.408 182938 DEBUG oslo_concurrency.lockutils [req-74d72cf8-84a5-4aee-adb4-28045b1a0ac0 req-e167ba32-5f42-482a-a978-bd580d1b6501 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.408 182938 DEBUG oslo_concurrency.lockutils [req-74d72cf8-84a5-4aee-adb4-28045b1a0ac0 req-e167ba32-5f42-482a-a978-bd580d1b6501 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.408 182938 DEBUG nova.network.neutron [req-74d72cf8-84a5-4aee-adb4-28045b1a0ac0 req-e167ba32-5f42-482a-a978-bd580d1b6501 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Refreshing network info cache for port 3c1979fb-e961-4676-b6c2-2ca71f2d859b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.926 182938 DEBUG oslo_concurrency.lockutils [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.926 182938 DEBUG oslo_concurrency.lockutils [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.927 182938 DEBUG oslo_concurrency.lockutils [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.928 182938 DEBUG oslo_concurrency.lockutils [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.928 182938 DEBUG oslo_concurrency.lockutils [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:42 np0005603500 nova_compute[182934]: 2026-01-31 06:48:42.930 182938 INFO nova.compute.manager [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Terminating instance
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.110 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 5b239873-eca6-4fdc-b15d-2801c75cafa9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.111 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.111 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.149 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.447 182938 DEBUG nova.compute.manager [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:48:43 np0005603500 kernel: tap3c1979fb-e9 (unregistering): left promiscuous mode
Jan 31 01:48:43 np0005603500 NetworkManager[55506]: <info>  [1769842123.4754] device (tap3c1979fb-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:48:43 np0005603500 ovn_controller[95398]: 2026-01-31T06:48:43Z|00180|binding|INFO|Releasing lport 3c1979fb-e961-4676-b6c2-2ca71f2d859b from this chassis (sb_readonly=0)
Jan 31 01:48:43 np0005603500 ovn_controller[95398]: 2026-01-31T06:48:43Z|00181|binding|INFO|Setting lport 3c1979fb-e961-4676-b6c2-2ca71f2d859b down in Southbound
Jan 31 01:48:43 np0005603500 ovn_controller[95398]: 2026-01-31T06:48:43Z|00182|binding|INFO|Removing iface tap3c1979fb-e9 ovn-installed in OVS
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.480 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.485 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:e9:1e 10.100.0.7'], port_security=['fa:16:3e:a7:e9:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '5b239873-eca6-4fdc-b15d-2801c75cafa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1eee62de-b98b-4359-b27b-63fb0219f31c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '9', 'neutron:security_group_ids': '68c99352-a463-4f0e-8dbc-014b6eb7e45f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=055d2587-07d8-48d2-bdce-e8f3e4584c68, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=3c1979fb-e961-4676-b6c2-2ca71f2d859b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.487 104644 INFO neutron.agent.ovn.metadata.agent [-] Port 3c1979fb-e961-4676-b6c2-2ca71f2d859b in datapath 1eee62de-b98b-4359-b27b-63fb0219f31c unbound from our chassis
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.488 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1eee62de-b98b-4359-b27b-63fb0219f31c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.489 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ee0cdfa3-c033-4047-8776-86ef4c666eef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.490 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c namespace which is not needed anymore
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.491 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:43 np0005603500 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 31 01:48:43 np0005603500 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 14.942s CPU time.
Jan 31 01:48:43 np0005603500 systemd-machined[154375]: Machine qemu-11-instance-0000000b terminated.
Jan 31 01:48:43 np0005603500 podman[219163]: 2026-01-31 06:48:43.585325508 +0000 UTC m=+0.027054871 container kill 9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 01:48:43 np0005603500 neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c[218585]: [NOTICE]   (218589) : haproxy version is 2.8.14-c23fe91
Jan 31 01:48:43 np0005603500 neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c[218585]: [NOTICE]   (218589) : path to executable is /usr/sbin/haproxy
Jan 31 01:48:43 np0005603500 neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c[218585]: [WARNING]  (218589) : Exiting Master process...
Jan 31 01:48:43 np0005603500 neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c[218585]: [ALERT]    (218589) : Current worker (218591) exited with code 143 (Terminated)
Jan 31 01:48:43 np0005603500 neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c[218585]: [WARNING]  (218589) : All workers exited. Exiting... (0)
Jan 31 01:48:43 np0005603500 systemd[1]: libpod-9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62.scope: Deactivated successfully.
Jan 31 01:48:43 np0005603500 conmon[218585]: conmon 9e2ec233d5bc77e5c534 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62.scope/container/memory.events
Jan 31 01:48:43 np0005603500 podman[219180]: 2026-01-31 06:48:43.622788898 +0000 UTC m=+0.019701847 container died 9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:48:43 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62-userdata-shm.mount: Deactivated successfully.
Jan 31 01:48:43 np0005603500 systemd[1]: var-lib-containers-storage-overlay-02739d5386649c30f52c3bdde2be6af934ef7aa6c2ce2ef9d37e5b16957b2e5f-merged.mount: Deactivated successfully.
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.661 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:48:43 np0005603500 podman[219180]: 2026-01-31 06:48:43.670766872 +0000 UTC m=+0.067679771 container remove 9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.674 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4d795dae-7782-4c08-98f0-16d68013cac7]: (4, ("Sat Jan 31 06:48:43 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c (9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62)\n9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62\nSat Jan 31 06:48:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c (9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62)\n9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:43 np0005603500 systemd[1]: libpod-conmon-9e2ec233d5bc77e5c534487049e37914f6e196b6d60e06dde3f40e19acef4b62.scope: Deactivated successfully.
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.676 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[3251e109-db50-46d8-a65c-137c3b34dcaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.676 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1eee62de-b98b-4359-b27b-63fb0219f31c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.676 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[86396260-25a2-4717-a6f8-c91490c5c348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.679 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1eee62de-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.680 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:43 np0005603500 kernel: tap1eee62de-b0: left promiscuous mode
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.685 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.689 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.691 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[2fb0c14e-77da-445c-85e3-45dc79171207]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.692 182938 INFO nova.virt.libvirt.driver [-] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Instance destroyed successfully.
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.693 182938 DEBUG nova.objects.instance [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 5b239873-eca6-4fdc-b15d-2801c75cafa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.716 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[53601cb0-a47d-444e-a7fa-cb41cd81c877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.717 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[523ab9f3-3403-403e-845c-f917a04687c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.730 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d10baccb-a1bc-4cc1-a114-c12f5cf04e6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434395, 'reachable_time': 28924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219230, 'error': None, 'target': 'ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.733 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1eee62de-b98b-4359-b27b-63fb0219f31c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:48:43 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:43.733 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[63642125-2d04-4bf6-89a0-918251bb32cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:48:43 np0005603500 systemd[1]: run-netns-ovnmeta\x2d1eee62de\x2db98b\x2d4359\x2db27b\x2d63fb0219f31c.mount: Deactivated successfully.
Jan 31 01:48:43 np0005603500 nova_compute[182934]: 2026-01-31 06:48:43.948 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.171 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.171 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.199 182938 DEBUG nova.virt.libvirt.vif [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:46:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-794782075',display_name='tempest-TestNetworkBasicOps-server-794782075',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-794782075',id=11,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE2josOk/g7wyoAw5iRuxWJY/V6rjRWyBU/8lsm36Dp437FCLAzteof7QKaa4WajgP5R2MmljJavY4Zh0uXVaRbYiaXjUL7K+61WxHZHp8+wGYv0/oOjd9Rd1zJ/UsE5PQ==',key_name='tempest-TestNetworkBasicOps-248708636',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:47:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-c8g2fcmv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:47:14Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=5b239873-eca6-4fdc-b15d-2801c75cafa9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.199 182938 DEBUG nova.network.os_vif_util [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.200 182938 DEBUG nova.network.os_vif_util [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a7:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c1979fb-e961-4676-b6c2-2ca71f2d859b,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c1979fb-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.200 182938 DEBUG os_vif [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c1979fb-e961-4676-b6c2-2ca71f2d859b,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c1979fb-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.202 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.202 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c1979fb-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.204 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.206 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.208 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.208 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9de26e0d-3a1a-4a13-bb87-8b27e33dec40) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.209 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.210 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.212 182938 INFO os_vif [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:e9:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c1979fb-e961-4676-b6c2-2ca71f2d859b,network=Network(1eee62de-b98b-4359-b27b-63fb0219f31c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c1979fb-e9')
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.213 182938 INFO nova.virt.libvirt.driver [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Deleting instance files /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9_del
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.214 182938 INFO nova.virt.libvirt.driver [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Deletion of /var/lib/nova/instances/5b239873-eca6-4fdc-b15d-2801c75cafa9_del complete
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.584 182938 DEBUG nova.compute.manager [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-unplugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.584 182938 DEBUG oslo_concurrency.lockutils [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.584 182938 DEBUG oslo_concurrency.lockutils [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.584 182938 DEBUG oslo_concurrency.lockutils [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.585 182938 DEBUG nova.compute.manager [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] No waiting events found dispatching network-vif-unplugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.585 182938 DEBUG nova.compute.manager [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-unplugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.585 182938 DEBUG nova.compute.manager [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.585 182938 DEBUG oslo_concurrency.lockutils [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.585 182938 DEBUG oslo_concurrency.lockutils [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.586 182938 DEBUG oslo_concurrency.lockutils [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.586 182938 DEBUG nova.compute.manager [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] No waiting events found dispatching network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.586 182938 WARNING nova.compute.manager [req-4b0b9781-89b8-4a73-80de-b12d62125323 req-2d59569a-3981-48ed-b229-e387cd602955 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received unexpected event network-vif-plugged-3c1979fb-e961-4676-b6c2-2ca71f2d859b for instance with vm_state active and task_state deleting.
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.725 182938 INFO nova.compute.manager [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Took 1.28 seconds to destroy the instance on the hypervisor.
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.726 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.726 182938 DEBUG nova.compute.manager [-] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:48:44 np0005603500 nova_compute[182934]: 2026-01-31 06:48:44.726 182938 DEBUG nova.network.neutron [-] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:48:45 np0005603500 nova_compute[182934]: 2026-01-31 06:48:45.171 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:48:45 np0005603500 nova_compute[182934]: 2026-01-31 06:48:45.171 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:48:45 np0005603500 nova_compute[182934]: 2026-01-31 06:48:45.172 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:48:45 np0005603500 nova_compute[182934]: 2026-01-31 06:48:45.172 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:48:45 np0005603500 nova_compute[182934]: 2026-01-31 06:48:45.172 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:48:45 np0005603500 nova_compute[182934]: 2026-01-31 06:48:45.853 182938 DEBUG nova.compute.manager [req-5196a13b-6ea9-4236-8ab6-c4f6d2368f67 req-0622a687-bd7b-4970-9fd0-364f9d61e0d1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Received event network-vif-deleted-3c1979fb-e961-4676-b6c2-2ca71f2d859b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:48:45 np0005603500 nova_compute[182934]: 2026-01-31 06:48:45.853 182938 INFO nova.compute.manager [req-5196a13b-6ea9-4236-8ab6-c4f6d2368f67 req-0622a687-bd7b-4970-9fd0-364f9d61e0d1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Neutron deleted interface 3c1979fb-e961-4676-b6c2-2ca71f2d859b; detaching it from the instance and deleting it from the info cache
Jan 31 01:48:45 np0005603500 nova_compute[182934]: 2026-01-31 06:48:45.854 182938 DEBUG nova.network.neutron [req-5196a13b-6ea9-4236-8ab6-c4f6d2368f67 req-0622a687-bd7b-4970-9fd0-364f9d61e0d1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:48:46 np0005603500 nova_compute[182934]: 2026-01-31 06:48:46.083 182938 DEBUG nova.network.neutron [req-74d72cf8-84a5-4aee-adb4-28045b1a0ac0 req-e167ba32-5f42-482a-a978-bd580d1b6501 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updated VIF entry in instance network info cache for port 3c1979fb-e961-4676-b6c2-2ca71f2d859b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:48:46 np0005603500 nova_compute[182934]: 2026-01-31 06:48:46.084 182938 DEBUG nova.network.neutron [req-74d72cf8-84a5-4aee-adb4-28045b1a0ac0 req-e167ba32-5f42-482a-a978-bd580d1b6501 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updating instance_info_cache with network_info: [{"id": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "address": "fa:16:3e:a7:e9:1e", "network": {"id": "1eee62de-b98b-4359-b27b-63fb0219f31c", "bridge": "br-int", "label": "tempest-network-smoke--1900567584", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c1979fb-e9", "ovs_interfaceid": "3c1979fb-e961-4676-b6c2-2ca71f2d859b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:48:46 np0005603500 nova_compute[182934]: 2026-01-31 06:48:46.157 182938 DEBUG nova.network.neutron [-] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:48:46 np0005603500 nova_compute[182934]: 2026-01-31 06:48:46.361 182938 DEBUG nova.compute.manager [req-5196a13b-6ea9-4236-8ab6-c4f6d2368f67 req-0622a687-bd7b-4970-9fd0-364f9d61e0d1 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Detach interface failed, port_id=3c1979fb-e961-4676-b6c2-2ca71f2d859b, reason: Instance 5b239873-eca6-4fdc-b15d-2801c75cafa9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Jan 31 01:48:46 np0005603500 nova_compute[182934]: 2026-01-31 06:48:46.590 182938 DEBUG oslo_concurrency.lockutils [req-74d72cf8-84a5-4aee-adb4-28045b1a0ac0 req-e167ba32-5f42-482a-a978-bd580d1b6501 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-5b239873-eca6-4fdc-b15d-2801c75cafa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:48:46 np0005603500 nova_compute[182934]: 2026-01-31 06:48:46.663 182938 INFO nova.compute.manager [-] [instance: 5b239873-eca6-4fdc-b15d-2801c75cafa9] Took 1.94 seconds to deallocate network for instance.
Jan 31 01:48:47 np0005603500 nova_compute[182934]: 2026-01-31 06:48:47.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:48:47 np0005603500 nova_compute[182934]: 2026-01-31 06:48:47.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:48:47 np0005603500 nova_compute[182934]: 2026-01-31 06:48:47.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:48:47 np0005603500 nova_compute[182934]: 2026-01-31 06:48:47.173 182938 DEBUG oslo_concurrency.lockutils [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:47 np0005603500 nova_compute[182934]: 2026-01-31 06:48:47.173 182938 DEBUG oslo_concurrency.lockutils [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:47 np0005603500 nova_compute[182934]: 2026-01-31 06:48:47.215 182938 DEBUG nova.compute.provider_tree [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:48:47 np0005603500 nova_compute[182934]: 2026-01-31 06:48:47.722 182938 DEBUG nova.scheduler.client.report [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:48:48 np0005603500 nova_compute[182934]: 2026-01-31 06:48:48.231 182938 DEBUG oslo_concurrency.lockutils [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:48 np0005603500 nova_compute[182934]: 2026-01-31 06:48:48.256 182938 INFO nova.scheduler.client.report [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 5b239873-eca6-4fdc-b15d-2801c75cafa9
Jan 31 01:48:48 np0005603500 nova_compute[182934]: 2026-01-31 06:48:48.982 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:49 np0005603500 podman[219232]: 2026-01-31 06:48:49.135635019 +0000 UTC m=+0.047272893 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-type=git, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 31 01:48:49 np0005603500 podman[219231]: 2026-01-31 06:48:49.157040459 +0000 UTC m=+0.067962480 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 01:48:49 np0005603500 nova_compute[182934]: 2026-01-31 06:48:49.209 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:49 np0005603500 nova_compute[182934]: 2026-01-31 06:48:49.272 182938 DEBUG oslo_concurrency.lockutils [None req-209e419c-1f70-49be-b49e-d982359732df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "5b239873-eca6-4fdc-b15d-2801c75cafa9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:53 np0005603500 nova_compute[182934]: 2026-01-31 06:48:53.983 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:54 np0005603500 nova_compute[182934]: 2026-01-31 06:48:54.210 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:54 np0005603500 nova_compute[182934]: 2026-01-31 06:48:54.643 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:54 np0005603500 nova_compute[182934]: 2026-01-31 06:48:54.666 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:56 np0005603500 podman[219279]: 2026-01-31 06:48:56.132389634 +0000 UTC m=+0.050921269 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 01:48:56 np0005603500 podman[219280]: 2026-01-31 06:48:56.160812157 +0000 UTC m=+0.075370966 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:48:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:56.403 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:48:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:56.403 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:48:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:56.403 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:48:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:58.604 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:48:58 np0005603500 nova_compute[182934]: 2026-01-31 06:48:58.604 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:58 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:48:58.605 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:48:58 np0005603500 nova_compute[182934]: 2026-01-31 06:48:58.985 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:48:59 np0005603500 nova_compute[182934]: 2026-01-31 06:48:59.212 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:00 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:00.198 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:aa:8f 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96523c16-0b36-40be-8125-ae3742ad344c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=33a0ff94-816a-425b-9d99-2adac00cde39) old=Port_Binding(mac=['fa:16:3e:89:aa:8f'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:49:00 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:00.199 104644 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 33a0ff94-816a-425b-9d99-2adac00cde39 in datapath a90eb6ca-f338-4bbe-896c-93de5ac6efd5 updated
Jan 31 01:49:00 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:00.200 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a90eb6ca-f338-4bbe-896c-93de5ac6efd5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:49:00 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:00.201 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5f656c-28f4-40d3-9e6f-d30edaa2e98d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:04 np0005603500 nova_compute[182934]: 2026-01-31 06:49:04.016 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:04 np0005603500 nova_compute[182934]: 2026-01-31 06:49:04.214 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:07 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:07.609 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:09 np0005603500 nova_compute[182934]: 2026-01-31 06:49:09.020 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:09 np0005603500 nova_compute[182934]: 2026-01-31 06:49:09.215 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:11 np0005603500 podman[219324]: 2026-01-31 06:49:11.122393701 +0000 UTC m=+0.041937023 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 31 01:49:11 np0005603500 podman[219323]: 2026-01-31 06:49:11.147479548 +0000 UTC m=+0.069185889 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:49:14 np0005603500 nova_compute[182934]: 2026-01-31 06:49:14.078 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:14 np0005603500 nova_compute[182934]: 2026-01-31 06:49:14.217 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:17 np0005603500 nova_compute[182934]: 2026-01-31 06:49:17.306 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "222641d4-3532-4595-b2c3-74a10b931e01" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:17 np0005603500 nova_compute[182934]: 2026-01-31 06:49:17.306 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:17 np0005603500 nova_compute[182934]: 2026-01-31 06:49:17.811 182938 DEBUG nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Jan 31 01:49:18 np0005603500 nova_compute[182934]: 2026-01-31 06:49:18.354 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:18 np0005603500 nova_compute[182934]: 2026-01-31 06:49:18.355 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:18 np0005603500 nova_compute[182934]: 2026-01-31 06:49:18.366 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Jan 31 01:49:18 np0005603500 nova_compute[182934]: 2026-01-31 06:49:18.367 182938 INFO nova.compute.claims [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Claim successful on node compute-0.ctlplane.example.com
Jan 31 01:49:19 np0005603500 nova_compute[182934]: 2026-01-31 06:49:19.121 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:19 np0005603500 nova_compute[182934]: 2026-01-31 06:49:19.219 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:19 np0005603500 nova_compute[182934]: 2026-01-31 06:49:19.424 182938 DEBUG nova.compute.provider_tree [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:49:19 np0005603500 nova_compute[182934]: 2026-01-31 06:49:19.932 182938 DEBUG nova.scheduler.client.report [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:49:20 np0005603500 podman[219366]: 2026-01-31 06:49:20.138368355 +0000 UTC m=+0.053147319 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 31 01:49:20 np0005603500 podman[219365]: 2026-01-31 06:49:20.152394091 +0000 UTC m=+0.074594961 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 
9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 31 01:49:20 np0005603500 nova_compute[182934]: 2026-01-31 06:49:20.440 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:20 np0005603500 nova_compute[182934]: 2026-01-31 06:49:20.441 182938 DEBUG nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Jan 31 01:49:20 np0005603500 nova_compute[182934]: 2026-01-31 06:49:20.952 182938 DEBUG nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Jan 31 01:49:20 np0005603500 nova_compute[182934]: 2026-01-31 06:49:20.952 182938 DEBUG nova.network.neutron [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Jan 31 01:49:21 np0005603500 nova_compute[182934]: 2026-01-31 06:49:21.460 182938 INFO nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 01:49:21 np0005603500 nova_compute[182934]: 2026-01-31 06:49:21.972 182938 DEBUG nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Jan 31 01:49:22 np0005603500 nova_compute[182934]: 2026-01-31 06:49:22.090 182938 DEBUG nova.policy [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dddc34b0385a49a5bd9bf081ed29e9fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '829310cd8381494e96216dba067ff8d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Jan 31 01:49:22 np0005603500 nova_compute[182934]: 2026-01-31 06:49:22.992 182938 DEBUG nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Jan 31 01:49:22 np0005603500 nova_compute[182934]: 2026-01-31 06:49:22.993 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Jan 31 01:49:22 np0005603500 nova_compute[182934]: 2026-01-31 06:49:22.993 182938 INFO nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Creating image(s)
Jan 31 01:49:22 np0005603500 nova_compute[182934]: 2026-01-31 06:49:22.994 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "/var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:22 np0005603500 nova_compute[182934]: 2026-01-31 06:49:22.994 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:22 np0005603500 nova_compute[182934]: 2026-01-31 06:49:22.994 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "/var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:22 np0005603500 nova_compute[182934]: 2026-01-31 06:49:22.995 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:49:22 np0005603500 nova_compute[182934]: 2026-01-31 06:49:22.998 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.000 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.046 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.048 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "d9035e96dc857b84194c2a2b496d294827e2de39" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.048 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.049 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.055 182938 DEBUG oslo_utils.imageutils.format_inspector [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.056 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.105 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.106 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.129 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39,backing_fmt=raw /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.130 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "d9035e96dc857b84194c2a2b496d294827e2de39" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.130 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.176 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d9035e96dc857b84194c2a2b496d294827e2de39 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.178 182938 DEBUG nova.virt.disk.api [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Checking if we can resize image /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.178 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.230 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.231 182938 DEBUG nova.virt.disk.api [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Cannot resize image /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.231 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.232 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Ensure instance console log exists: /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.233 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.233 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:23 np0005603500 nova_compute[182934]: 2026-01-31 06:49:23.234 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:24 np0005603500 nova_compute[182934]: 2026-01-31 06:49:24.121 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:24 np0005603500 nova_compute[182934]: 2026-01-31 06:49:24.220 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:24 np0005603500 nova_compute[182934]: 2026-01-31 06:49:24.430 182938 DEBUG nova.network.neutron [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Successfully created port: c3dc7aeb-40f9-48ee-9cd3-29d158928d96 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 01:49:25 np0005603500 nova_compute[182934]: 2026-01-31 06:49:25.551 182938 DEBUG nova.network.neutron [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Successfully updated port: c3dc7aeb-40f9-48ee-9cd3-29d158928d96 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 01:49:25 np0005603500 nova_compute[182934]: 2026-01-31 06:49:25.748 182938 DEBUG nova.compute.manager [req-9c6c0c66-8095-4f4a-88dc-dc1d818b1b2d req-8015ce65-2e0c-42b6-bd08-a299a3efa758 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received event network-changed-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:49:25 np0005603500 nova_compute[182934]: 2026-01-31 06:49:25.748 182938 DEBUG nova.compute.manager [req-9c6c0c66-8095-4f4a-88dc-dc1d818b1b2d req-8015ce65-2e0c-42b6-bd08-a299a3efa758 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Refreshing instance network info cache due to event network-changed-c3dc7aeb-40f9-48ee-9cd3-29d158928d96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:49:25 np0005603500 nova_compute[182934]: 2026-01-31 06:49:25.749 182938 DEBUG oslo_concurrency.lockutils [req-9c6c0c66-8095-4f4a-88dc-dc1d818b1b2d req-8015ce65-2e0c-42b6-bd08-a299a3efa758 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:49:25 np0005603500 nova_compute[182934]: 2026-01-31 06:49:25.749 182938 DEBUG oslo_concurrency.lockutils [req-9c6c0c66-8095-4f4a-88dc-dc1d818b1b2d req-8015ce65-2e0c-42b6-bd08-a299a3efa758 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:49:25 np0005603500 nova_compute[182934]: 2026-01-31 06:49:25.750 182938 DEBUG nova.network.neutron [req-9c6c0c66-8095-4f4a-88dc-dc1d818b1b2d req-8015ce65-2e0c-42b6-bd08-a299a3efa758 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Refreshing network info cache for port c3dc7aeb-40f9-48ee-9cd3-29d158928d96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:49:26 np0005603500 nova_compute[182934]: 2026-01-31 06:49:26.057 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:49:27 np0005603500 nova_compute[182934]: 2026-01-31 06:49:27.081 182938 DEBUG nova.network.neutron [req-9c6c0c66-8095-4f4a-88dc-dc1d818b1b2d req-8015ce65-2e0c-42b6-bd08-a299a3efa758 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:49:27 np0005603500 podman[219429]: 2026-01-31 06:49:27.124231813 +0000 UTC m=+0.043256723 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:49:27 np0005603500 podman[219428]: 2026-01-31 06:49:27.124850321 +0000 UTC m=+0.046911508 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:49:29 np0005603500 nova_compute[182934]: 2026-01-31 06:49:29.053 182938 DEBUG nova.network.neutron [req-9c6c0c66-8095-4f4a-88dc-dc1d818b1b2d req-8015ce65-2e0c-42b6-bd08-a299a3efa758 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:49:29 np0005603500 nova_compute[182934]: 2026-01-31 06:49:29.123 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:29 np0005603500 nova_compute[182934]: 2026-01-31 06:49:29.222 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:29 np0005603500 nova_compute[182934]: 2026-01-31 06:49:29.567 182938 DEBUG oslo_concurrency.lockutils [req-9c6c0c66-8095-4f4a-88dc-dc1d818b1b2d req-8015ce65-2e0c-42b6-bd08-a299a3efa758 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:49:29 np0005603500 nova_compute[182934]: 2026-01-31 06:49:29.568 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquired lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:49:29 np0005603500 nova_compute[182934]: 2026-01-31 06:49:29.569 182938 DEBUG nova.network.neutron [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Jan 31 01:49:30 np0005603500 nova_compute[182934]: 2026-01-31 06:49:30.511 182938 DEBUG nova.network.neutron [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Jan 31 01:49:32 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:32Z|00183|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Jan 31 01:49:32 np0005603500 nova_compute[182934]: 2026-01-31 06:49:32.797 182938 DEBUG nova.network.neutron [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Updating instance_info_cache with network_info: [{"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.304 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Releasing lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.305 182938 DEBUG nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Instance network_info: |[{"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.307 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Start _get_guest_xml network_info=[{"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': '9f613975-b701-42a0-9b35-7d5c4a2cb7f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.311 182938 WARNING nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.312 182938 DEBUG nova.virt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-675275927', uuid='222641d4-3532-4595-b2c3-74a10b931e01'), owner=OwnerMeta(userid='dddc34b0385a49a5bd9bf081ed29e9fd', username='tempest-TestNetworkBasicOps-1355800406-project-member', projectid='829310cd8381494e96216dba067ff8d3', projectname='tempest-TestNetworkBasicOps-1355800406'), image=ImageMeta(id='9f613975-b701-42a0-9b35-7d5c4a2cb7f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device
_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1769842173.3129184) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.323 182938 DEBUG nova.virt.libvirt.host [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.324 182938 DEBUG nova.virt.libvirt.host [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.326 182938 DEBUG nova.virt.libvirt.host [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.327 182938 DEBUG nova.virt.libvirt.host [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.327 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.327 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T06:29:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9956992e-a3ca-497f-9747-3ae270e07def',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T06:29:54Z,direct_url=<?>,disk_format='qcow2',id=9f613975-b701-42a0-9b35-7d5c4a2cb7f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='258ad37c7a194c8cb9fd805ff19f8fe0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T06:29:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.328 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.328 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.328 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.328 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.329 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.329 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.329 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.329 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.329 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.329 182938 DEBUG nova.virt.hardware [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.332 182938 DEBUG nova.virt.libvirt.vif [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:49:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-675275927',display_name='tempest-TestNetworkBasicOps-server-675275927',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-675275927',id=13,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBGmI/81T46hHx1ZYmr0K4L4E3SDc6g+Gj4NNknEonMcdrqp+7G9FPBhpsRQ/wVcJQXwOHUDwBngodLWpkPzzTyhXkknOjRgGfGepHGUKaBZ7YNVBjj+f3ZNRzcIkYUQlQ==',key_name='tempest-TestNetworkBasicOps-58601127',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0pgyln0q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:49:22Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=222641d4-3532-4595-b2c3-74a10b931e01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.333 182938 DEBUG nova.network.os_vif_util [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.333 182938 DEBUG nova.network.os_vif_util [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:b9:fc,bridge_name='br-int',has_traffic_filtering=True,id=c3dc7aeb-40f9-48ee-9cd3-29d158928d96,network=Network(a90eb6ca-f338-4bbe-896c-93de5ac6efd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3dc7aeb-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.334 182938 DEBUG nova.objects.instance [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 222641d4-3532-4595-b2c3-74a10b931e01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.841 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] End _get_guest_xml xml=<domain type="kvm">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <uuid>222641d4-3532-4595-b2c3-74a10b931e01</uuid>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <name>instance-0000000d</name>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <memory>131072</memory>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <vcpu>1</vcpu>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <metadata>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <nova:name>tempest-TestNetworkBasicOps-server-675275927</nova:name>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <nova:creationTime>2026-01-31 06:49:33</nova:creationTime>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <nova:flavor name="m1.nano">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:        <nova:memory>128</nova:memory>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:        <nova:disk>1</nova:disk>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:        <nova:swap>0</nova:swap>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:        <nova:vcpus>1</nova:vcpus>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      </nova:flavor>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <nova:owner>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:        <nova:user uuid="dddc34b0385a49a5bd9bf081ed29e9fd">tempest-TestNetworkBasicOps-1355800406-project-member</nova:user>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:        <nova:project uuid="829310cd8381494e96216dba067ff8d3">tempest-TestNetworkBasicOps-1355800406</nova:project>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      </nova:owner>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <nova:root type="image" uuid="9f613975-b701-42a0-9b35-7d5c4a2cb7f2"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <nova:ports>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:        <nova:port uuid="c3dc7aeb-40f9-48ee-9cd3-29d158928d96">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:        </nova:port>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      </nova:ports>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    </nova:instance>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  </metadata>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <sysinfo type="smbios">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <system>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <entry name="manufacturer">RDO</entry>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <entry name="product">OpenStack Compute</entry>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <entry name="serial">222641d4-3532-4595-b2c3-74a10b931e01</entry>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <entry name="uuid">222641d4-3532-4595-b2c3-74a10b931e01</entry>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <entry name="family">Virtual Machine</entry>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    </system>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  </sysinfo>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <os>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <boot dev="hd"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <smbios mode="sysinfo"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  </os>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <features>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <acpi/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <apic/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <vmcoreinfo/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  </features>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <clock offset="utc">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 01:49:33 np0005603500 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <timer name="hpet" present="no"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  </clock>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <cpu mode="host-model" match="exact">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  </cpu>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  <devices>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <disk type="file" device="disk">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <target dev="vda" bus="virtio"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <disk type="file" device="cdrom">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <driver name="qemu" type="raw" cache="none"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <source file="/var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk.config"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <target dev="sda" bus="sata"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    </disk>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <interface type="ethernet">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <mac address="fa:16:3e:62:b9:fc"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <mtu size="1442"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <target dev="tapc3dc7aeb-40"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    </interface>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <serial type="pty">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <log file="/var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/console.log" append="off"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    </serial>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <video>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <model type="virtio"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    </video>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <input type="tablet" bus="usb"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <rng model="virtio">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <backend model="random">/dev/urandom</backend>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    </rng>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <controller type="usb" index="0"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    <memballoon model="virtio">
Jan 31 01:49:33 np0005603500 nova_compute[182934]:      <stats period="10"/>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:    </memballoon>
Jan 31 01:49:33 np0005603500 nova_compute[182934]:  </devices>
Jan 31 01:49:33 np0005603500 nova_compute[182934]: </domain>
Jan 31 01:49:33 np0005603500 nova_compute[182934]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.842 182938 DEBUG nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Preparing to wait for external event network-vif-plugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.842 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "222641d4-3532-4595-b2c3-74a10b931e01-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.842 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.843 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.843 182938 DEBUG nova.virt.libvirt.vif [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-31T06:49:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-675275927',display_name='tempest-TestNetworkBasicOps-server-675275927',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-675275927',id=13,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBGmI/81T46hHx1ZYmr0K4L4E3SDc6g+Gj4NNknEonMcdrqp+7G9FPBhpsRQ/wVcJQXwOHUDwBngodLWpkPzzTyhXkknOjRgGfGepHGUKaBZ7YNVBjj+f3ZNRzcIkYUQlQ==',key_name='tempest-TestNetworkBasicOps-58601127',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0pgyln0q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T06:49:22Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=222641d4-3532-4595-b2c3-74a10b931e01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.844 182938 DEBUG nova.network.os_vif_util [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.844 182938 DEBUG nova.network.os_vif_util [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:b9:fc,bridge_name='br-int',has_traffic_filtering=True,id=c3dc7aeb-40f9-48ee-9cd3-29d158928d96,network=Network(a90eb6ca-f338-4bbe-896c-93de5ac6efd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3dc7aeb-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.845 182938 DEBUG os_vif [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:b9:fc,bridge_name='br-int',has_traffic_filtering=True,id=c3dc7aeb-40f9-48ee-9cd3-29d158928d96,network=Network(a90eb6ca-f338-4bbe-896c-93de5ac6efd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3dc7aeb-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.845 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.845 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.846 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.846 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.846 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '23175cc7-9db7-5088-9353-ff0677cfa096', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.847 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.849 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.851 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.851 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3dc7aeb-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.852 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc3dc7aeb-40, col_values=(('qos', UUID('a69d38ac-730d-4c81-9e25-3b950accd46c')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.852 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc3dc7aeb-40, col_values=(('external_ids', {'iface-id': 'c3dc7aeb-40f9-48ee-9cd3-29d158928d96', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:b9:fc', 'vm-uuid': '222641d4-3532-4595-b2c3-74a10b931e01'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.853 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.854 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:33 np0005603500 NetworkManager[55506]: <info>  [1769842173.8556] manager: (tapc3dc7aeb-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.855 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.860 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:33 np0005603500 nova_compute[182934]: 2026-01-31 06:49:33.861 182938 INFO os_vif [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:b9:fc,bridge_name='br-int',has_traffic_filtering=True,id=c3dc7aeb-40f9-48ee-9cd3-29d158928d96,network=Network(a90eb6ca-f338-4bbe-896c-93de5ac6efd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3dc7aeb-40')
Jan 31 01:49:34 np0005603500 nova_compute[182934]: 2026-01-31 06:49:34.126 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:35 np0005603500 nova_compute[182934]: 2026-01-31 06:49:35.402 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:49:35 np0005603500 nova_compute[182934]: 2026-01-31 06:49:35.403 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Jan 31 01:49:35 np0005603500 nova_compute[182934]: 2026-01-31 06:49:35.403 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] No VIF found with MAC fa:16:3e:62:b9:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Jan 31 01:49:35 np0005603500 nova_compute[182934]: 2026-01-31 06:49:35.404 182938 INFO nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Using config drive
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.081 182938 INFO nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Creating config drive at /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk.config
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.084 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmprbnjqwst execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.203 182938 DEBUG oslo_concurrency.processutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmprbnjqwst" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:49:38 np0005603500 kernel: tapc3dc7aeb-40: entered promiscuous mode
Jan 31 01:49:38 np0005603500 NetworkManager[55506]: <info>  [1769842178.2443] manager: (tapc3dc7aeb-40): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Jan 31 01:49:38 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:38Z|00184|binding|INFO|Claiming lport c3dc7aeb-40f9-48ee-9cd3-29d158928d96 for this chassis.
Jan 31 01:49:38 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:38Z|00185|binding|INFO|c3dc7aeb-40f9-48ee-9cd3-29d158928d96: Claiming fa:16:3e:62:b9:fc 10.100.0.11
Jan 31 01:49:38 np0005603500 systemd-udevd[219491]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.288 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.290 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.292 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:38 np0005603500 NetworkManager[55506]: <info>  [1769842178.3012] device (tapc3dc7aeb-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:49:38 np0005603500 NetworkManager[55506]: <info>  [1769842178.3020] device (tapc3dc7aeb-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.308 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:b9:fc 10.100.0.11'], port_security=['fa:16:3e:62:b9:fc 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '222641d4-3532-4595-b2c3-74a10b931e01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '11ea60c9-e64f-4191-acf1-88e2c39ffc66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96523c16-0b36-40be-8125-ae3742ad344c, chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=c3dc7aeb-40f9-48ee-9cd3-29d158928d96) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.309 104644 INFO neutron.agent.ovn.metadata.agent [-] Port c3dc7aeb-40f9-48ee-9cd3-29d158928d96 in datapath a90eb6ca-f338-4bbe-896c-93de5ac6efd5 bound to our chassis
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.310 104644 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a90eb6ca-f338-4bbe-896c-93de5ac6efd5
Jan 31 01:49:38 np0005603500 systemd-machined[154375]: New machine qemu-13-instance-0000000d.
Jan 31 01:49:38 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:38Z|00186|binding|INFO|Setting lport c3dc7aeb-40f9-48ee-9cd3-29d158928d96 ovn-installed in OVS
Jan 31 01:49:38 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:38Z|00187|binding|INFO|Setting lport c3dc7aeb-40f9-48ee-9cd3-29d158928d96 up in Southbound
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.319 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ee638700-9e14-41a7-a603-d7bb3eb4ba07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.320 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa90eb6ca-f1 in ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.321 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.323 210946 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa90eb6ca-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.323 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[404c4cfe-0bf9-4746-83d6-b3a7724d0cbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.324 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[351618d9-ae92-4f3d-84d1-5d522d8f32e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.336 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[0540c775-4791-4961-ab93-29242248c541]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.346 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ca8da7-4eb9-42e9-bbbf-e53fd97b862b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.366 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[f92c71bd-445e-4cf7-9d24-c7c965f1ba26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.370 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[3e946bf1-d325-417b-b4f3-65ccdd6766dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 NetworkManager[55506]: <info>  [1769842178.3718] manager: (tapa90eb6ca-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.396 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[60e81274-b943-4d6e-ae42-3e9099ee86cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.401 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[614ef82c-3dd4-4896-84f6-70519399fb65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 NetworkManager[55506]: <info>  [1769842178.4188] device (tapa90eb6ca-f0): carrier: link connected
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.421 211809 DEBUG oslo.privsep.daemon [-] privsep: reply[5c279673-f605-4a56-ae51-29e9a2065bfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.435 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[6797448c-5d18-469e-984f-bb5afee54f42]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa90eb6ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:aa:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448971, 'reachable_time': 30771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219527, 'error': None, 'target': 'ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.447 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[fd94eba1-82a9-423b-94e8-17f4d125de9e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe89:aa8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448971, 'tstamp': 448971}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219528, 'error': None, 'target': 'ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.461 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a2958ff8-82c3-428e-a983-dbfdebec83f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa90eb6ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:aa:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448971, 'reachable_time': 30771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219529, 'error': None, 'target': 'ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.485 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[d098297d-55c3-41e0-91cb-5b36d33c3357]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.532 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d0823d-9902-4d3c-8c24-59100534c778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.533 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa90eb6ca-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.533 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.534 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa90eb6ca-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:38 np0005603500 kernel: tapa90eb6ca-f0: entered promiscuous mode
Jan 31 01:49:38 np0005603500 NetworkManager[55506]: <info>  [1769842178.5373] manager: (tapa90eb6ca-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.536 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.540 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa90eb6ca-f0, col_values=(('external_ids', {'iface-id': '33a0ff94-816a-425b-9d99-2adac00cde39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:38 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:38Z|00188|binding|INFO|Releasing lport 33a0ff94-816a-425b-9d99-2adac00cde39 from this chassis (sb_readonly=0)
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.542 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.545 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[c66e4d24-36f9-4c2b-ac13-98315cddb87b]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.546 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.547 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.547 104644 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a90eb6ca-f338-4bbe-896c-93de5ac6efd5 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.548 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.549 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.549 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[9fafa00f-ecca-4acd-aaf0-c5f79a8779e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.551 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.551 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[4184026b-743b-4780-9fb1-a125fd7c5c06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.552 104644 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: global
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    log         /dev/log local0 debug
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    log-tag     haproxy-metadata-proxy-a90eb6ca-f338-4bbe-896c-93de5ac6efd5
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    user        root
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    group       root
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    maxconn     1024
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    pidfile     /var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    daemon
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: defaults
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    log global
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    mode http
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    option httplog
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    option dontlognull
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    option http-server-close
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    option forwardfor
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    retries                 3
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    timeout http-request    30s
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    timeout connect         30s
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    timeout client          32s
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    timeout server          32s
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    timeout http-keep-alive 30s
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: listen listener
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    bind 169.254.169.254:80
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]:    http-request add-header X-OVN-Network-ID a90eb6ca-f338-4bbe-896c-93de5ac6efd5
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.553 104644 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'env', 'PROCESS_TAG=haproxy-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Jan 31 01:49:38 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:38.631 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.633 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.709 182938 DEBUG nova.compute.manager [req-e616ba18-4a05-4a9c-85c5-1a05e1624411 req-724721ab-1d4b-4e1e-8ef3-7dccc6f85110 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received event network-vif-plugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.710 182938 DEBUG oslo_concurrency.lockutils [req-e616ba18-4a05-4a9c-85c5-1a05e1624411 req-724721ab-1d4b-4e1e-8ef3-7dccc6f85110 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "222641d4-3532-4595-b2c3-74a10b931e01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.710 182938 DEBUG oslo_concurrency.lockutils [req-e616ba18-4a05-4a9c-85c5-1a05e1624411 req-724721ab-1d4b-4e1e-8ef3-7dccc6f85110 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.710 182938 DEBUG oslo_concurrency.lockutils [req-e616ba18-4a05-4a9c-85c5-1a05e1624411 req-724721ab-1d4b-4e1e-8ef3-7dccc6f85110 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.710 182938 DEBUG nova.compute.manager [req-e616ba18-4a05-4a9c-85c5-1a05e1624411 req-724721ab-1d4b-4e1e-8ef3-7dccc6f85110 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Processing event network-vif-plugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.853 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.882 182938 DEBUG nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.885 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.888 182938 INFO nova.virt.libvirt.driver [-] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Instance spawned successfully.
Jan 31 01:49:38 np0005603500 nova_compute[182934]: 2026-01-31 06:49:38.888 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Jan 31 01:49:38 np0005603500 podman[219566]: 2026-01-31 06:49:38.844887047 +0000 UTC m=+0.021453751 image pull d52ce0b189025039ce86fc9564595bcce243e95c598f912f021ea09cd4116a16 quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0
Jan 31 01:49:39 np0005603500 podman[219566]: 2026-01-31 06:49:39.061949628 +0000 UTC m=+0.238516312 container create 8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 31 01:49:39 np0005603500 nova_compute[182934]: 2026-01-31 06:49:39.128 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:39 np0005603500 systemd[1]: Started libpod-conmon-8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60.scope.
Jan 31 01:49:39 np0005603500 systemd[1]: Started libcrun container.
Jan 31 01:49:39 np0005603500 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e9f67162d6b64724a2d19bbf00f19a3cffd7fedd7c66a8cdbc0970e41e07412/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 01:49:39 np0005603500 podman[219566]: 2026-01-31 06:49:39.243730562 +0000 UTC m=+0.420297276 container init 8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 01:49:39 np0005603500 podman[219566]: 2026-01-31 06:49:39.248464081 +0000 UTC m=+0.425030765 container start 8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5, tcib_managed=true, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 01:49:39 np0005603500 neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5[219583]: [NOTICE]   (219587) : New worker (219589) forked
Jan 31 01:49:39 np0005603500 neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5[219583]: [NOTICE]   (219587) : Loading success.
Jan 31 01:49:39 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:39.330 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:49:39 np0005603500 nova_compute[182934]: 2026-01-31 06:49:39.407 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:49:39 np0005603500 nova_compute[182934]: 2026-01-31 06:49:39.409 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:49:39 np0005603500 nova_compute[182934]: 2026-01-31 06:49:39.410 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:49:39 np0005603500 nova_compute[182934]: 2026-01-31 06:49:39.410 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:49:39 np0005603500 nova_compute[182934]: 2026-01-31 06:49:39.411 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:49:39 np0005603500 nova_compute[182934]: 2026-01-31 06:49:39.411 182938 DEBUG nova.virt.libvirt.driver [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Jan 31 01:49:39 np0005603500 nova_compute[182934]: 2026-01-31 06:49:39.923 182938 INFO nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Took 16.93 seconds to spawn the instance on the hypervisor.
Jan 31 01:49:39 np0005603500 nova_compute[182934]: 2026-01-31 06:49:39.924 182938 DEBUG nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Jan 31 01:49:40 np0005603500 nova_compute[182934]: 2026-01-31 06:49:40.445 182938 INFO nova.compute.manager [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Took 22.12 seconds to build instance.
Jan 31 01:49:40 np0005603500 nova_compute[182934]: 2026-01-31 06:49:40.955 182938 DEBUG oslo_concurrency.lockutils [None req-2ec06cf7-a3a4-411f-b89a-03fd61ea2ca5 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.185 182938 DEBUG nova.compute.manager [req-c28b4e67-88cd-4393-816e-42e6d0fdc8e5 req-2968a8c9-828d-48cb-9926-e60cd3551867 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received event network-vif-plugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.186 182938 DEBUG oslo_concurrency.lockutils [req-c28b4e67-88cd-4393-816e-42e6d0fdc8e5 req-2968a8c9-828d-48cb-9926-e60cd3551867 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "222641d4-3532-4595-b2c3-74a10b931e01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.186 182938 DEBUG oslo_concurrency.lockutils [req-c28b4e67-88cd-4393-816e-42e6d0fdc8e5 req-2968a8c9-828d-48cb-9926-e60cd3551867 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.186 182938 DEBUG oslo_concurrency.lockutils [req-c28b4e67-88cd-4393-816e-42e6d0fdc8e5 req-2968a8c9-828d-48cb-9926-e60cd3551867 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.187 182938 DEBUG nova.compute.manager [req-c28b4e67-88cd-4393-816e-42e6d0fdc8e5 req-2968a8c9-828d-48cb-9926-e60cd3551867 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] No waiting events found dispatching network-vif-plugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.187 182938 WARNING nova.compute.manager [req-c28b4e67-88cd-4393-816e-42e6d0fdc8e5 req-2968a8c9-828d-48cb-9926-e60cd3551867 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received unexpected event network-vif-plugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 for instance with vm_state active and task_state None.
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.661 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.662 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.662 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:41 np0005603500 nova_compute[182934]: 2026-01-31 06:49:41.662 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:49:41 np0005603500 podman[219600]: 2026-01-31 06:49:41.742437489 +0000 UTC m=+0.046787204 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:49:41 np0005603500 podman[219601]: 2026-01-31 06:49:41.742368576 +0000 UTC m=+0.044884593 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:49:42 np0005603500 nova_compute[182934]: 2026-01-31 06:49:42.701 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:49:42 np0005603500 nova_compute[182934]: 2026-01-31 06:49:42.755 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:49:42 np0005603500 nova_compute[182934]: 2026-01-31 06:49:42.756 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:49:42 np0005603500 nova_compute[182934]: 2026-01-31 06:49:42.802 182938 DEBUG oslo_concurrency.processutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:49:42 np0005603500 nova_compute[182934]: 2026-01-31 06:49:42.924 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:49:42 np0005603500 nova_compute[182934]: 2026-01-31 06:49:42.925 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5577MB free_disk=73.21087265014648GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:49:42 np0005603500 nova_compute[182934]: 2026-01-31 06:49:42.925 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:42 np0005603500 nova_compute[182934]: 2026-01-31 06:49:42.926 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:43 np0005603500 nova_compute[182934]: 2026-01-31 06:49:43.855 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:43 np0005603500 nova_compute[182934]: 2026-01-31 06:49:43.980 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Instance 222641d4-3532-4595-b2c3-74a10b931e01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Jan 31 01:49:43 np0005603500 nova_compute[182934]: 2026-01-31 06:49:43.981 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:49:43 np0005603500 nova_compute[182934]: 2026-01-31 06:49:43.981 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:49:44 np0005603500 nova_compute[182934]: 2026-01-31 06:49:44.022 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:49:44 np0005603500 nova_compute[182934]: 2026-01-31 06:49:44.129 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:44 np0005603500 nova_compute[182934]: 2026-01-31 06:49:44.528 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:49:45 np0005603500 nova_compute[182934]: 2026-01-31 06:49:45.039 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:49:45 np0005603500 nova_compute[182934]: 2026-01-31 06:49:45.040 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:46 np0005603500 nova_compute[182934]: 2026-01-31 06:49:46.039 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:49:46 np0005603500 nova_compute[182934]: 2026-01-31 06:49:46.040 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:49:47 np0005603500 nova_compute[182934]: 2026-01-31 06:49:47.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:49:47 np0005603500 nova_compute[182934]: 2026-01-31 06:49:47.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:49:48 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:48.332 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:49:48 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:48Z|00189|binding|INFO|Releasing lport 33a0ff94-816a-425b-9d99-2adac00cde39 from this chassis (sb_readonly=0)
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.524 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:48 np0005603500 NetworkManager[55506]: <info>  [1769842188.5267] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 31 01:49:48 np0005603500 NetworkManager[55506]: <info>  [1769842188.5274] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 31 01:49:48 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:48Z|00190|binding|INFO|Releasing lport 33a0ff94-816a-425b-9d99-2adac00cde39 from this chassis (sb_readonly=0)
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.529 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.856 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.943 182938 DEBUG nova.compute.manager [req-72ca62e5-b4b8-4b37-9141-0310fdaa3929 req-009821f4-7e2e-4c87-ac5f-e2c3072fea47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received event network-changed-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.943 182938 DEBUG nova.compute.manager [req-72ca62e5-b4b8-4b37-9141-0310fdaa3929 req-009821f4-7e2e-4c87-ac5f-e2c3072fea47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Refreshing instance network info cache due to event network-changed-c3dc7aeb-40f9-48ee-9cd3-29d158928d96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.943 182938 DEBUG oslo_concurrency.lockutils [req-72ca62e5-b4b8-4b37-9141-0310fdaa3929 req-009821f4-7e2e-4c87-ac5f-e2c3072fea47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.943 182938 DEBUG oslo_concurrency.lockutils [req-72ca62e5-b4b8-4b37-9141-0310fdaa3929 req-009821f4-7e2e-4c87-ac5f-e2c3072fea47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:49:48 np0005603500 nova_compute[182934]: 2026-01-31 06:49:48.944 182938 DEBUG nova.network.neutron [req-72ca62e5-b4b8-4b37-9141-0310fdaa3929 req-009821f4-7e2e-4c87-ac5f-e2c3072fea47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Refreshing network info cache for port c3dc7aeb-40f9-48ee-9cd3-29d158928d96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:49:49 np0005603500 nova_compute[182934]: 2026-01-31 06:49:49.176 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:49:49 np0005603500 nova_compute[182934]: 2026-01-31 06:49:49.177 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:49:49 np0005603500 nova_compute[182934]: 2026-01-31 06:49:49.177 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:50 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:50Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:b9:fc 10.100.0.11
Jan 31 01:49:50 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:50Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:b9:fc 10.100.0.11
Jan 31 01:49:51 np0005603500 podman[219662]: 2026-01-31 06:49:51.149739503 +0000 UTC m=+0.072274623 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 01:49:51 np0005603500 podman[219663]: 2026-01-31 06:49:51.154618197 +0000 UTC m=+0.071833398 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, version=9.7, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Jan 31 01:49:53 np0005603500 nova_compute[182934]: 2026-01-31 06:49:53.862 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:54 np0005603500 nova_compute[182934]: 2026-01-31 06:49:54.086 182938 DEBUG nova.network.neutron [req-72ca62e5-b4b8-4b37-9141-0310fdaa3929 req-009821f4-7e2e-4c87-ac5f-e2c3072fea47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Updated VIF entry in instance network info cache for port c3dc7aeb-40f9-48ee-9cd3-29d158928d96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:49:54 np0005603500 nova_compute[182934]: 2026-01-31 06:49:54.087 182938 DEBUG nova.network.neutron [req-72ca62e5-b4b8-4b37-9141-0310fdaa3929 req-009821f4-7e2e-4c87-ac5f-e2c3072fea47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Updating instance_info_cache with network_info: [{"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:49:54 np0005603500 nova_compute[182934]: 2026-01-31 06:49:54.180 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:54 np0005603500 nova_compute[182934]: 2026-01-31 06:49:54.593 182938 DEBUG oslo_concurrency.lockutils [req-72ca62e5-b4b8-4b37-9141-0310fdaa3929 req-009821f4-7e2e-4c87-ac5f-e2c3072fea47 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:49:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:56.464 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:49:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:56.464 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:49:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:49:56.465 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:49:57 np0005603500 nova_compute[182934]: 2026-01-31 06:49:57.491 182938 INFO nova.compute.manager [None req-05d0a3ae-ea36-4b39-9731-2cca512713b9 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Get console output
Jan 31 01:49:57 np0005603500 nova_compute[182934]: 2026-01-31 06:49:57.495 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:49:58 np0005603500 podman[219711]: 2026-01-31 06:49:58.127736527 +0000 UTC m=+0.047396053 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:49:58 np0005603500 podman[219710]: 2026-01-31 06:49:58.134010967 +0000 UTC m=+0.056973858 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 31 01:49:58 np0005603500 nova_compute[182934]: 2026-01-31 06:49:58.865 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:58 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:58Z|00191|binding|INFO|Releasing lport 33a0ff94-816a-425b-9d99-2adac00cde39 from this chassis (sb_readonly=0)
Jan 31 01:49:58 np0005603500 nova_compute[182934]: 2026-01-31 06:49:58.871 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:58 np0005603500 ovn_controller[95398]: 2026-01-31T06:49:58Z|00192|binding|INFO|Releasing lport 33a0ff94-816a-425b-9d99-2adac00cde39 from this chassis (sb_readonly=0)
Jan 31 01:49:58 np0005603500 nova_compute[182934]: 2026-01-31 06:49:58.888 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:49:59 np0005603500 nova_compute[182934]: 2026-01-31 06:49:59.181 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:00 np0005603500 nova_compute[182934]: 2026-01-31 06:50:00.240 182938 INFO nova.compute.manager [None req-b750eef5-f0de-40e8-a515-849cc14e3d71 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Get console output
Jan 31 01:50:00 np0005603500 nova_compute[182934]: 2026-01-31 06:50:00.245 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:50:01 np0005603500 nova_compute[182934]: 2026-01-31 06:50:01.304 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:01 np0005603500 NetworkManager[55506]: <info>  [1769842201.3061] manager: (patch-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 31 01:50:01 np0005603500 NetworkManager[55506]: <info>  [1769842201.3068] manager: (patch-br-int-to-provnet-87b905e7-8a72-4b0a-8ef5-595215d6ce0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Jan 31 01:50:01 np0005603500 ovn_controller[95398]: 2026-01-31T06:50:01Z|00193|binding|INFO|Releasing lport 33a0ff94-816a-425b-9d99-2adac00cde39 from this chassis (sb_readonly=0)
Jan 31 01:50:01 np0005603500 nova_compute[182934]: 2026-01-31 06:50:01.315 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:01 np0005603500 nova_compute[182934]: 2026-01-31 06:50:01.320 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:01 np0005603500 nova_compute[182934]: 2026-01-31 06:50:01.958 182938 INFO nova.compute.manager [None req-e41fff53-01f3-48e3-8457-9114b6307df3 dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Get console output
Jan 31 01:50:01 np0005603500 nova_compute[182934]: 2026-01-31 06:50:01.961 211654 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 31 01:50:03 np0005603500 nova_compute[182934]: 2026-01-31 06:50:03.661 182938 DEBUG nova.compute.manager [req-5db86da8-f2f4-410b-ab46-649fd7c2cf06 req-61b440fa-2270-40a5-a1a7-0b7ea97ce524 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received event network-changed-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:50:03 np0005603500 nova_compute[182934]: 2026-01-31 06:50:03.661 182938 DEBUG nova.compute.manager [req-5db86da8-f2f4-410b-ab46-649fd7c2cf06 req-61b440fa-2270-40a5-a1a7-0b7ea97ce524 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Refreshing instance network info cache due to event network-changed-c3dc7aeb-40f9-48ee-9cd3-29d158928d96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Jan 31 01:50:03 np0005603500 nova_compute[182934]: 2026-01-31 06:50:03.661 182938 DEBUG oslo_concurrency.lockutils [req-5db86da8-f2f4-410b-ab46-649fd7c2cf06 req-61b440fa-2270-40a5-a1a7-0b7ea97ce524 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Jan 31 01:50:03 np0005603500 nova_compute[182934]: 2026-01-31 06:50:03.662 182938 DEBUG oslo_concurrency.lockutils [req-5db86da8-f2f4-410b-ab46-649fd7c2cf06 req-61b440fa-2270-40a5-a1a7-0b7ea97ce524 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquired lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Jan 31 01:50:03 np0005603500 nova_compute[182934]: 2026-01-31 06:50:03.662 182938 DEBUG nova.network.neutron [req-5db86da8-f2f4-410b-ab46-649fd7c2cf06 req-61b440fa-2270-40a5-a1a7-0b7ea97ce524 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Refreshing network info cache for port c3dc7aeb-40f9-48ee-9cd3-29d158928d96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Jan 31 01:50:03 np0005603500 nova_compute[182934]: 2026-01-31 06:50:03.867 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.117 182938 DEBUG oslo_concurrency.lockutils [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "222641d4-3532-4595-b2c3-74a10b931e01" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.118 182938 DEBUG oslo_concurrency.lockutils [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.118 182938 DEBUG oslo_concurrency.lockutils [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "222641d4-3532-4595-b2c3-74a10b931e01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.118 182938 DEBUG oslo_concurrency.lockutils [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.119 182938 DEBUG oslo_concurrency.lockutils [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.120 182938 INFO nova.compute.manager [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Terminating instance
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.183 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.627 182938 DEBUG nova.compute.manager [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Jan 31 01:50:04 np0005603500 kernel: tapc3dc7aeb-40 (unregistering): left promiscuous mode
Jan 31 01:50:04 np0005603500 NetworkManager[55506]: <info>  [1769842204.6539] device (tapc3dc7aeb-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.659 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:04 np0005603500 ovn_controller[95398]: 2026-01-31T06:50:04Z|00194|binding|INFO|Releasing lport c3dc7aeb-40f9-48ee-9cd3-29d158928d96 from this chassis (sb_readonly=0)
Jan 31 01:50:04 np0005603500 ovn_controller[95398]: 2026-01-31T06:50:04Z|00195|binding|INFO|Setting lport c3dc7aeb-40f9-48ee-9cd3-29d158928d96 down in Southbound
Jan 31 01:50:04 np0005603500 ovn_controller[95398]: 2026-01-31T06:50:04Z|00196|binding|INFO|Removing iface tapc3dc7aeb-40 ovn-installed in OVS
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.660 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.666 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:04 np0005603500 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 31 01:50:04 np0005603500 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 12.407s CPU time.
Jan 31 01:50:04 np0005603500 systemd-machined[154375]: Machine qemu-13-instance-0000000d terminated.
Jan 31 01:50:04 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:04.876 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:b9:fc 10.100.0.11'], port_security=['fa:16:3e:62:b9:fc 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '222641d4-3532-4595-b2c3-74a10b931e01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '829310cd8381494e96216dba067ff8d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '11ea60c9-e64f-4191-acf1-88e2c39ffc66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96523c16-0b36-40be-8125-ae3742ad344c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>], logical_port=c3dc7aeb-40f9-48ee-9cd3-29d158928d96) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7ac43970a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:50:04 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:04.877 104644 INFO neutron.agent.ovn.metadata.agent [-] Port c3dc7aeb-40f9-48ee-9cd3-29d158928d96 in datapath a90eb6ca-f338-4bbe-896c-93de5ac6efd5 unbound from our chassis
Jan 31 01:50:04 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:04.879 104644 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a90eb6ca-f338-4bbe-896c-93de5ac6efd5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Jan 31 01:50:04 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:04.880 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[620c403c-f49f-40c6-876f-b92651dd08d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:50:04 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:04.880 104644 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5 namespace which is not needed anymore
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.883 182938 INFO nova.virt.libvirt.driver [-] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Instance destroyed successfully.
Jan 31 01:50:04 np0005603500 nova_compute[182934]: 2026-01-31 06:50:04.883 182938 DEBUG nova.objects.instance [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lazy-loading 'resources' on Instance uuid 222641d4-3532-4595-b2c3-74a10b931e01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Jan 31 01:50:04 np0005603500 neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5[219583]: [NOTICE]   (219587) : haproxy version is 2.8.14-c23fe91
Jan 31 01:50:04 np0005603500 neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5[219583]: [NOTICE]   (219587) : path to executable is /usr/sbin/haproxy
Jan 31 01:50:04 np0005603500 neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5[219583]: [WARNING]  (219587) : Exiting Master process...
Jan 31 01:50:04 np0005603500 podman[219795]: 2026-01-31 06:50:04.984293112 +0000 UTC m=+0.030758906 container kill 8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3)
Jan 31 01:50:04 np0005603500 neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5[219583]: [ALERT]    (219587) : Current worker (219589) exited with code 143 (Terminated)
Jan 31 01:50:04 np0005603500 neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5[219583]: [WARNING]  (219587) : All workers exited. Exiting... (0)
Jan 31 01:50:04 np0005603500 systemd[1]: libpod-8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60.scope: Deactivated successfully.
Jan 31 01:50:05 np0005603500 podman[219809]: 2026-01-31 06:50:05.020935114 +0000 UTC m=+0.021212083 container died 8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 01:50:05 np0005603500 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60-userdata-shm.mount: Deactivated successfully.
Jan 31 01:50:05 np0005603500 systemd[1]: var-lib-containers-storage-overlay-5e9f67162d6b64724a2d19bbf00f19a3cffd7fedd7c66a8cdbc0970e41e07412-merged.mount: Deactivated successfully.
Jan 31 01:50:05 np0005603500 podman[219809]: 2026-01-31 06:50:05.052148424 +0000 UTC m=+0.052425373 container cleanup 8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 01:50:05 np0005603500 systemd[1]: libpod-conmon-8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60.scope: Deactivated successfully.
Jan 31 01:50:05 np0005603500 podman[219811]: 2026-01-31 06:50:05.068927416 +0000 UTC m=+0.062099620 container remove 8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.072 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[f95ee6e1-7234-4dc2-88fd-a0ee6294f87b]: (4, ("Sat Jan 31 06:50:04 AM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5 (8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60)\n8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60\nSat Jan 31 06:50:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5 (8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60)\n8b61c8e8c2d928cc209875d19088dc62fa309c31baee90d75ca094360493ba60\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.073 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[363c3faa-cc34-4253-91b9-7ec37b662316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.074 104644 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a90eb6ca-f338-4bbe-896c-93de5ac6efd5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.074 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[611d1994-0ea1-4a96-8300-633d57703ef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.075 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa90eb6ca-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.076 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:05 np0005603500 kernel: tapa90eb6ca-f0: left promiscuous mode
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.084 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.086 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[166097a4-2188-4249-ac3a-66e91942db22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.104 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[59306728-e71d-40ac-b708-528b5c714d81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.104 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[af9d973d-95b9-4968-8d5c-d4025dbfa639]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.117 210946 DEBUG oslo.privsep.daemon [-] privsep: reply[ab7f6cf5-6ec9-476d-be10-46f6876f6a5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448966, 'reachable_time': 32572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219846, 'error': None, 'target': 'ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.119 105168 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a90eb6ca-f338-4bbe-896c-93de5ac6efd5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 31 01:50:05 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:05.119 105168 DEBUG oslo.privsep.daemon [-] privsep: reply[2452656f-b373-4f9f-bdda-2c15d0026911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Jan 31 01:50:05 np0005603500 systemd[1]: run-netns-ovnmeta\x2da90eb6ca\x2df338\x2d4bbe\x2d896c\x2d93de5ac6efd5.mount: Deactivated successfully.
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.389 182938 DEBUG nova.virt.libvirt.vif [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-31T06:49:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-675275927',display_name='tempest-TestNetworkBasicOps-server-675275927',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-675275927',id=13,image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBGmI/81T46hHx1ZYmr0K4L4E3SDc6g+Gj4NNknEonMcdrqp+7G9FPBhpsRQ/wVcJQXwOHUDwBngodLWpkPzzTyhXkknOjRgGfGepHGUKaBZ7YNVBjj+f3ZNRzcIkYUQlQ==',key_name='tempest-TestNetworkBasicOps-58601127',keypairs=<?>,launch_index=0,launched_at=2026-01-31T06:49:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='829310cd8381494e96216dba067ff8d3',ramdisk_id='',reservation_id='r-0pgyln0q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9f613975-b701-42a0-9b35-7d5c4a2cb7f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1355800406',owner_user_name='tempest-TestNetworkBasicOps-1355800406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T06:49:39Z,user_data=None,user_id='dddc34b0385a49a5bd9bf081ed29e9fd',uuid=222641d4-3532-4595-b2c3-74a10b931e01,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.390 182938 DEBUG nova.network.os_vif_util [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converting VIF {"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.390 182938 DEBUG nova.network.os_vif_util [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:b9:fc,bridge_name='br-int',has_traffic_filtering=True,id=c3dc7aeb-40f9-48ee-9cd3-29d158928d96,network=Network(a90eb6ca-f338-4bbe-896c-93de5ac6efd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3dc7aeb-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.391 182938 DEBUG os_vif [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:b9:fc,bridge_name='br-int',has_traffic_filtering=True,id=c3dc7aeb-40f9-48ee-9cd3-29d158928d96,network=Network(a90eb6ca-f338-4bbe-896c-93de5ac6efd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3dc7aeb-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.393 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.393 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3dc7aeb-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.395 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.398 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.399 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.399 182938 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a69d38ac-730d-4c81-9e25-3b950accd46c) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.400 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.401 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.402 182938 INFO os_vif [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:b9:fc,bridge_name='br-int',has_traffic_filtering=True,id=c3dc7aeb-40f9-48ee-9cd3-29d158928d96,network=Network(a90eb6ca-f338-4bbe-896c-93de5ac6efd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3dc7aeb-40')
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.403 182938 INFO nova.virt.libvirt.driver [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Deleting instance files /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01_del
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.403 182938 INFO nova.virt.libvirt.driver [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Deletion of /var/lib/nova/instances/222641d4-3532-4595-b2c3-74a10b931e01_del complete
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.913 182938 DEBUG nova.compute.manager [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received event network-vif-unplugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.914 182938 DEBUG oslo_concurrency.lockutils [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "222641d4-3532-4595-b2c3-74a10b931e01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.914 182938 DEBUG oslo_concurrency.lockutils [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.914 182938 DEBUG oslo_concurrency.lockutils [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.915 182938 DEBUG nova.compute.manager [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] No waiting events found dispatching network-vif-unplugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.915 182938 DEBUG nova.compute.manager [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received event network-vif-unplugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.915 182938 DEBUG nova.compute.manager [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received event network-vif-plugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.915 182938 DEBUG oslo_concurrency.lockutils [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Acquiring lock "222641d4-3532-4595-b2c3-74a10b931e01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.916 182938 DEBUG oslo_concurrency.lockutils [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.916 182938 DEBUG oslo_concurrency.lockutils [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.916 182938 DEBUG nova.compute.manager [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] No waiting events found dispatching network-vif-plugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.916 182938 WARNING nova.compute.manager [req-7e70a25b-9ba9-4322-8e89-e9ab7a411a1f req-546d0162-c6e8-4cc5-a45f-efb8efb0ba41 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received unexpected event network-vif-plugged-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 for instance with vm_state active and task_state deleting.
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.920 182938 INFO nova.compute.manager [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Took 1.29 seconds to destroy the instance on the hypervisor.
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.921 182938 DEBUG oslo.service.backend.eventlet.loopingcall [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.921 182938 DEBUG nova.compute.manager [-] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Jan 31 01:50:05 np0005603500 nova_compute[182934]: 2026-01-31 06:50:05.921 182938 DEBUG nova.network.neutron [-] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Jan 31 01:50:07 np0005603500 nova_compute[182934]: 2026-01-31 06:50:07.550 182938 DEBUG nova.compute.manager [req-7f3daf48-1f5c-4f62-928b-5c8b90936d01 req-4589c705-023c-4690-be41-8f4692c640d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Received event network-vif-deleted-c3dc7aeb-40f9-48ee-9cd3-29d158928d96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Jan 31 01:50:07 np0005603500 nova_compute[182934]: 2026-01-31 06:50:07.551 182938 INFO nova.compute.manager [req-7f3daf48-1f5c-4f62-928b-5c8b90936d01 req-4589c705-023c-4690-be41-8f4692c640d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Neutron deleted interface c3dc7aeb-40f9-48ee-9cd3-29d158928d96; detaching it from the instance and deleting it from the info cache
Jan 31 01:50:07 np0005603500 nova_compute[182934]: 2026-01-31 06:50:07.551 182938 DEBUG nova.network.neutron [req-7f3daf48-1f5c-4f62-928b-5c8b90936d01 req-4589c705-023c-4690-be41-8f4692c640d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:50:07 np0005603500 nova_compute[182934]: 2026-01-31 06:50:07.827 182938 DEBUG nova.network.neutron [-] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:50:08 np0005603500 nova_compute[182934]: 2026-01-31 06:50:08.058 182938 DEBUG nova.compute.manager [req-7f3daf48-1f5c-4f62-928b-5c8b90936d01 req-4589c705-023c-4690-be41-8f4692c640d4 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Detach interface failed, port_id=c3dc7aeb-40f9-48ee-9cd3-29d158928d96, reason: Instance 222641d4-3532-4595-b2c3-74a10b931e01 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Jan 31 01:50:08 np0005603500 nova_compute[182934]: 2026-01-31 06:50:08.091 182938 DEBUG nova.network.neutron [req-5db86da8-f2f4-410b-ab46-649fd7c2cf06 req-61b440fa-2270-40a5-a1a7-0b7ea97ce524 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Updated VIF entry in instance network info cache for port c3dc7aeb-40f9-48ee-9cd3-29d158928d96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Jan 31 01:50:08 np0005603500 nova_compute[182934]: 2026-01-31 06:50:08.092 182938 DEBUG nova.network.neutron [req-5db86da8-f2f4-410b-ab46-649fd7c2cf06 req-61b440fa-2270-40a5-a1a7-0b7ea97ce524 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Updating instance_info_cache with network_info: [{"id": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "address": "fa:16:3e:62:b9:fc", "network": {"id": "a90eb6ca-f338-4bbe-896c-93de5ac6efd5", "bridge": "br-int", "label": "tempest-network-smoke--38087495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "829310cd8381494e96216dba067ff8d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3dc7aeb-40", "ovs_interfaceid": "c3dc7aeb-40f9-48ee-9cd3-29d158928d96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 01:50:08 np0005603500 nova_compute[182934]: 2026-01-31 06:50:08.350 182938 INFO nova.compute.manager [-] [instance: 222641d4-3532-4595-b2c3-74a10b931e01] Took 2.43 seconds to deallocate network for instance.
Jan 31 01:50:08 np0005603500 nova_compute[182934]: 2026-01-31 06:50:08.622 182938 DEBUG oslo_concurrency.lockutils [req-5db86da8-f2f4-410b-ab46-649fd7c2cf06 req-61b440fa-2270-40a5-a1a7-0b7ea97ce524 837e5b291c2c4d0fb1322fcb296cac6b 46a8293b1f6c4f4a8b41da2ca6121e0a - - default default] Releasing lock "refresh_cache-222641d4-3532-4595-b2c3-74a10b931e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Jan 31 01:50:08 np0005603500 nova_compute[182934]: 2026-01-31 06:50:08.858 182938 DEBUG oslo_concurrency.lockutils [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:50:08 np0005603500 nova_compute[182934]: 2026-01-31 06:50:08.859 182938 DEBUG oslo_concurrency.lockutils [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:50:08 np0005603500 nova_compute[182934]: 2026-01-31 06:50:08.933 182938 DEBUG nova.compute.provider_tree [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:50:09 np0005603500 nova_compute[182934]: 2026-01-31 06:50:09.185 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:09 np0005603500 nova_compute[182934]: 2026-01-31 06:50:09.441 182938 DEBUG nova.scheduler.client.report [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:50:09 np0005603500 nova_compute[182934]: 2026-01-31 06:50:09.954 182938 DEBUG oslo_concurrency.lockutils [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:50:09 np0005603500 nova_compute[182934]: 2026-01-31 06:50:09.975 182938 INFO nova.scheduler.client.report [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Deleted allocations for instance 222641d4-3532-4595-b2c3-74a10b931e01
Jan 31 01:50:10 np0005603500 nova_compute[182934]: 2026-01-31 06:50:10.400 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:10 np0005603500 nova_compute[182934]: 2026-01-31 06:50:10.994 182938 DEBUG oslo_concurrency.lockutils [None req-53812c27-ffbc-43ae-b7a0-ebcb18fa05df dddc34b0385a49a5bd9bf081ed29e9fd 829310cd8381494e96216dba067ff8d3 - - default default] Lock "222641d4-3532-4595-b2c3-74a10b931e01" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.876s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:50:12 np0005603500 podman[219848]: 2026-01-31 06:50:12.131159203 +0000 UTC m=+0.044496812 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 01:50:12 np0005603500 podman[219847]: 2026-01-31 06:50:12.135370237 +0000 UTC m=+0.050381918 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 01:50:14 np0005603500 nova_compute[182934]: 2026-01-31 06:50:14.187 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:15 np0005603500 nova_compute[182934]: 2026-01-31 06:50:15.401 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:16 np0005603500 nova_compute[182934]: 2026-01-31 06:50:16.336 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:16 np0005603500 nova_compute[182934]: 2026-01-31 06:50:16.353 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:50:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:50:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:50:19 np0005603500 nova_compute[182934]: 2026-01-31 06:50:19.188 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:20 np0005603500 nova_compute[182934]: 2026-01-31 06:50:20.402 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:22 np0005603500 podman[219889]: 2026-01-31 06:50:22.144725226 +0000 UTC m=+0.053066864 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.tags=minimal rhel9)
Jan 31 01:50:22 np0005603500 podman[219888]: 2026-01-31 06:50:22.159313549 +0000 UTC m=+0.068013098 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 01:50:24 np0005603500 nova_compute[182934]: 2026-01-31 06:50:24.192 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:25 np0005603500 nova_compute[182934]: 2026-01-31 06:50:25.405 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:29 np0005603500 podman[219939]: 2026-01-31 06:50:29.135507728 +0000 UTC m=+0.052290589 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:50:29 np0005603500 podman[219938]: 2026-01-31 06:50:29.153696735 +0000 UTC m=+0.074464213 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 01:50:29 np0005603500 nova_compute[182934]: 2026-01-31 06:50:29.193 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:30 np0005603500 nova_compute[182934]: 2026-01-31 06:50:30.407 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:34 np0005603500 nova_compute[182934]: 2026-01-31 06:50:34.195 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:35 np0005603500 nova_compute[182934]: 2026-01-31 06:50:35.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:35 np0005603500 nova_compute[182934]: 2026-01-31 06:50:35.408 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:39 np0005603500 nova_compute[182934]: 2026-01-31 06:50:39.197 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:40 np0005603500 nova_compute[182934]: 2026-01-31 06:50:40.411 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:42 np0005603500 nova_compute[182934]: 2026-01-31 06:50:42.717 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:43 np0005603500 podman[219982]: 2026-01-31 06:50:43.129386531 +0000 UTC m=+0.048764978 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:43 np0005603500 podman[219983]: 2026-01-31 06:50:43.16153361 +0000 UTC m=+0.077767977 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.671 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.671 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.671 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.672 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.787 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.788 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5790MB free_disk=73.21175765991211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.788 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:50:43 np0005603500 nova_compute[182934]: 2026-01-31 06:50:43.788 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:50:44 np0005603500 nova_compute[182934]: 2026-01-31 06:50:44.199 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:44 np0005603500 nova_compute[182934]: 2026-01-31 06:50:44.838 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:50:44 np0005603500 nova_compute[182934]: 2026-01-31 06:50:44.838 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:50:44 np0005603500 nova_compute[182934]: 2026-01-31 06:50:44.859 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:50:45 np0005603500 nova_compute[182934]: 2026-01-31 06:50:45.413 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:46 np0005603500 ovn_controller[95398]: 2026-01-31T06:50:46Z|00197|memory_trim|INFO|Detected inactivity (last active 30026 ms ago): trimming memory
Jan 31 01:50:48 np0005603500 nova_compute[182934]: 2026-01-31 06:50:48.176 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:50:48 np0005603500 nova_compute[182934]: 2026-01-31 06:50:48.685 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:50:48 np0005603500 nova_compute[182934]: 2026-01-31 06:50:48.686 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.898s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:50:49 np0005603500 nova_compute[182934]: 2026-01-31 06:50:49.238 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:50 np0005603500 nova_compute[182934]: 2026-01-31 06:50:50.437 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:50 np0005603500 nova_compute[182934]: 2026-01-31 06:50:50.687 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:50 np0005603500 nova_compute[182934]: 2026-01-31 06:50:50.687 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:50 np0005603500 nova_compute[182934]: 2026-01-31 06:50:50.687 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:50 np0005603500 nova_compute[182934]: 2026-01-31 06:50:50.687 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:50 np0005603500 nova_compute[182934]: 2026-01-31 06:50:50.688 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:50:51 np0005603500 nova_compute[182934]: 2026-01-31 06:50:51.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:53 np0005603500 podman[220025]: 2026-01-31 06:50:53.130418835 +0000 UTC m=+0.049264683 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1769056855, config_id=openstack_network_exporter)
Jan 31 01:50:53 np0005603500 podman[220024]: 2026-01-31 06:50:53.179716549 +0000 UTC m=+0.100491327 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 01:50:54 np0005603500 nova_compute[182934]: 2026-01-31 06:50:54.239 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:55 np0005603500 nova_compute[182934]: 2026-01-31 06:50:55.439 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:50:56 np0005603500 nova_compute[182934]: 2026-01-31 06:50:56.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:50:56 np0005603500 nova_compute[182934]: 2026-01-31 06:50:56.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Jan 31 01:50:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:56.483 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:50:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:56.483 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:50:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:50:56.484 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:50:56 np0005603500 nova_compute[182934]: 2026-01-31 06:50:56.661 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Jan 31 01:50:59 np0005603500 nova_compute[182934]: 2026-01-31 06:50:59.240 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:00 np0005603500 podman[220071]: 2026-01-31 06:51:00.125389091 +0000 UTC m=+0.042286041 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:51:00 np0005603500 podman[220070]: 2026-01-31 06:51:00.147344587 +0000 UTC m=+0.069677300 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true)
Jan 31 01:51:00 np0005603500 nova_compute[182934]: 2026-01-31 06:51:00.442 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:02 np0005603500 systemd-logind[821]: New session 26 of user zuul.
Jan 31 01:51:02 np0005603500 systemd[1]: Started Session 26 of User zuul.
Jan 31 01:51:04 np0005603500 nova_compute[182934]: 2026-01-31 06:51:04.284 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:05 np0005603500 nova_compute[182934]: 2026-01-31 06:51:05.445 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:06 np0005603500 ovs-vsctl[220289]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 31 01:51:07 np0005603500 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 220142 (sos)
Jan 31 01:51:07 np0005603500 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 31 01:51:07 np0005603500 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 31 01:51:07 np0005603500 virtqemud[183236]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 31 01:51:07 np0005603500 virtqemud[183236]: hostname: compute-0
Jan 31 01:51:07 np0005603500 virtqemud[183236]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 31 01:51:07 np0005603500 virtqemud[183236]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 31 01:51:07 np0005603500 virtqemud[183236]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 31 01:51:09 np0005603500 nova_compute[182934]: 2026-01-31 06:51:09.285 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:09 np0005603500 systemd[1]: Starting Hostname Service...
Jan 31 01:51:09 np0005603500 systemd[1]: Started Hostname Service.
Jan 31 01:51:10 np0005603500 nova_compute[182934]: 2026-01-31 06:51:10.446 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:14 np0005603500 nova_compute[182934]: 2026-01-31 06:51:14.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:14 np0005603500 nova_compute[182934]: 2026-01-31 06:51:14.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Jan 31 01:51:14 np0005603500 podman[221195]: 2026-01-31 06:51:14.226295737 +0000 UTC m=+0.056436760 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 31 01:51:14 np0005603500 podman[221208]: 2026-01-31 06:51:14.245731554 +0000 UTC m=+0.070302590 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 01:51:14 np0005603500 nova_compute[182934]: 2026-01-31 06:51:14.287 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:15 np0005603500 nova_compute[182934]: 2026-01-31 06:51:15.448 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:15 np0005603500 ovs-appctl[221833]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 31 01:51:15 np0005603500 ovs-appctl[221840]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 31 01:51:15 np0005603500 ovs-appctl[221849]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 31 01:51:19 np0005603500 nova_compute[182934]: 2026-01-31 06:51:19.290 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:20 np0005603500 nova_compute[182934]: 2026-01-31 06:51:20.450 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:22 np0005603500 virtqemud[183236]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 31 01:51:23 np0005603500 podman[223310]: 2026-01-31 06:51:23.234031353 +0000 UTC m=+0.071205158 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Jan 31 01:51:23 np0005603500 podman[223327]: 2026-01-31 06:51:23.29448313 +0000 UTC m=+0.084565692 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 01:51:24 np0005603500 systemd[1]: Starting Time & Date Service...
Jan 31 01:51:24 np0005603500 systemd[1]: Started Time & Date Service.
Jan 31 01:51:24 np0005603500 nova_compute[182934]: 2026-01-31 06:51:24.292 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:25 np0005603500 nova_compute[182934]: 2026-01-31 06:51:25.452 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:29 np0005603500 nova_compute[182934]: 2026-01-31 06:51:29.293 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:30 np0005603500 podman[223456]: 2026-01-31 06:51:30.229657828 +0000 UTC m=+0.060533721 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 01:51:30 np0005603500 podman[223455]: 2026-01-31 06:51:30.244335532 +0000 UTC m=+0.079100368 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:51:30 np0005603500 nova_compute[182934]: 2026-01-31 06:51:30.454 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:34 np0005603500 nova_compute[182934]: 2026-01-31 06:51:34.297 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:35 np0005603500 nova_compute[182934]: 2026-01-31 06:51:35.455 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:39 np0005603500 nova_compute[182934]: 2026-01-31 06:51:39.297 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:40 np0005603500 nova_compute[182934]: 2026-01-31 06:51:40.457 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:42 np0005603500 nova_compute[182934]: 2026-01-31 06:51:42.700 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:44 np0005603500 nova_compute[182934]: 2026-01-31 06:51:44.298 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:44 np0005603500 podman[223500]: 2026-01-31 06:51:44.887286074 +0000 UTC m=+0.046356900 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 31 01:51:44 np0005603500 podman[223499]: 2026-01-31 06:51:44.909291092 +0000 UTC m=+0.067747479 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:45 np0005603500 systemd[1]: session-26.scope: Deactivated successfully.
Jan 31 01:51:45 np0005603500 systemd[1]: session-26.scope: Consumed 1min 8.003s CPU time, 507.5M memory peak, read 103.1M from disk, written 27.7M to disk.
Jan 31 01:51:45 np0005603500 systemd-logind[821]: Session 26 logged out. Waiting for processes to exit.
Jan 31 01:51:45 np0005603500 systemd-logind[821]: Removed session 26.
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.458 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:45 np0005603500 systemd-logind[821]: New session 27 of user zuul.
Jan 31 01:51:45 np0005603500 systemd[1]: Started Session 27 of User zuul.
Jan 31 01:51:45 np0005603500 systemd[1]: session-27.scope: Deactivated successfully.
Jan 31 01:51:45 np0005603500 systemd-logind[821]: Session 27 logged out. Waiting for processes to exit.
Jan 31 01:51:45 np0005603500 systemd-logind[821]: Removed session 27.
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.683 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.683 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.683 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.683 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:51:45 np0005603500 systemd-logind[821]: New session 28 of user zuul.
Jan 31 01:51:45 np0005603500 systemd[1]: Started Session 28 of User zuul.
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.821 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.823 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5529MB free_disk=73.19497680664062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.823 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:51:45 np0005603500 nova_compute[182934]: 2026-01-31 06:51:45.824 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:51:45 np0005603500 systemd[1]: session-28.scope: Deactivated successfully.
Jan 31 01:51:45 np0005603500 systemd-logind[821]: Session 28 logged out. Waiting for processes to exit.
Jan 31 01:51:45 np0005603500 systemd-logind[821]: Removed session 28.
Jan 31 01:51:47 np0005603500 nova_compute[182934]: 2026-01-31 06:51:47.379 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:51:47 np0005603500 nova_compute[182934]: 2026-01-31 06:51:47.379 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:51:47 np0005603500 nova_compute[182934]: 2026-01-31 06:51:47.628 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing inventories for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Jan 31 01:51:47 np0005603500 nova_compute[182934]: 2026-01-31 06:51:47.892 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating ProviderTree inventory for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Jan 31 01:51:47 np0005603500 nova_compute[182934]: 2026-01-31 06:51:47.893 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:51:47 np0005603500 nova_compute[182934]: 2026-01-31 06:51:47.912 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing aggregate associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Jan 31 01:51:47 np0005603500 nova_compute[182934]: 2026-01-31 06:51:47.948 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing trait associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, traits: COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,HW_ARCH_X86_64,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_CRB,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SHA,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_FMA3,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI _refresh_associations 
/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Jan 31 01:51:47 np0005603500 nova_compute[182934]: 2026-01-31 06:51:47.971 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:51:48 np0005603500 nova_compute[182934]: 2026-01-31 06:51:48.491 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:51:48 np0005603500 nova_compute[182934]: 2026-01-31 06:51:48.492 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:51:48 np0005603500 nova_compute[182934]: 2026-01-31 06:51:48.493 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:51:49 np0005603500 nova_compute[182934]: 2026-01-31 06:51:49.300 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:50 np0005603500 nova_compute[182934]: 2026-01-31 06:51:50.460 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:52 np0005603500 nova_compute[182934]: 2026-01-31 06:51:52.493 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:52 np0005603500 nova_compute[182934]: 2026-01-31 06:51:52.494 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:53 np0005603500 nova_compute[182934]: 2026-01-31 06:51:53.027 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:53 np0005603500 nova_compute[182934]: 2026-01-31 06:51:53.028 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:53 np0005603500 nova_compute[182934]: 2026-01-31 06:51:53.028 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:51:53 np0005603500 nova_compute[182934]: 2026-01-31 06:51:53.028 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:51:54 np0005603500 podman[223599]: 2026-01-31 06:51:54.134493192 +0000 UTC m=+0.055112488 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.7, build-date=2026-01-22T05:09:47Z, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Jan 31 01:51:54 np0005603500 podman[223598]: 2026-01-31 06:51:54.151399018 +0000 UTC m=+0.071496118 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 31 01:51:54 np0005603500 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 01:51:54 np0005603500 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 01:51:54 np0005603500 nova_compute[182934]: 2026-01-31 06:51:54.301 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:55 np0005603500 nova_compute[182934]: 2026-01-31 06:51:55.462 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:51:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:51:56.502 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:51:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:51:56.503 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:51:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:51:56.503 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:51:59 np0005603500 nova_compute[182934]: 2026-01-31 06:51:59.303 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:00 np0005603500 nova_compute[182934]: 2026-01-31 06:52:00.464 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:01 np0005603500 podman[223648]: 2026-01-31 06:52:01.124724905 +0000 UTC m=+0.045486353 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:52:01 np0005603500 podman[223647]: 2026-01-31 06:52:01.129279499 +0000 UTC m=+0.052074301 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 01:52:04 np0005603500 nova_compute[182934]: 2026-01-31 06:52:04.305 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:05 np0005603500 nova_compute[182934]: 2026-01-31 06:52:05.466 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:09 np0005603500 nova_compute[182934]: 2026-01-31 06:52:09.307 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:10 np0005603500 nova_compute[182934]: 2026-01-31 06:52:10.500 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:14 np0005603500 nova_compute[182934]: 2026-01-31 06:52:14.307 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:15 np0005603500 podman[223692]: 2026-01-31 06:52:15.119377163 +0000 UTC m=+0.036784917 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 01:52:15 np0005603500 podman[223691]: 2026-01-31 06:52:15.124111453 +0000 UTC m=+0.042983233 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:52:15 np0005603500 nova_compute[182934]: 2026-01-31 06:52:15.506 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.985 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:52:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:52:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:52:19 np0005603500 nova_compute[182934]: 2026-01-31 06:52:19.324 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:20 np0005603500 nova_compute[182934]: 2026-01-31 06:52:20.508 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:24 np0005603500 nova_compute[182934]: 2026-01-31 06:52:24.326 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:25 np0005603500 podman[223734]: 2026-01-31 06:52:25.124365285 +0000 UTC m=+0.042731595 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 31 01:52:25 np0005603500 podman[223733]: 2026-01-31 06:52:25.144311188 +0000 UTC m=+0.065591700 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 01:52:25 np0005603500 nova_compute[182934]: 2026-01-31 06:52:25.510 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:29 np0005603500 nova_compute[182934]: 2026-01-31 06:52:29.328 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:30 np0005603500 nova_compute[182934]: 2026-01-31 06:52:30.511 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:32 np0005603500 podman[223777]: 2026-01-31 06:52:32.123426259 +0000 UTC m=+0.046770974 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 01:52:32 np0005603500 podman[223778]: 2026-01-31 06:52:32.145593992 +0000 UTC m=+0.067344027 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:52:34 np0005603500 nova_compute[182934]: 2026-01-31 06:52:34.329 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:35 np0005603500 nova_compute[182934]: 2026-01-31 06:52:35.512 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:39 np0005603500 nova_compute[182934]: 2026-01-31 06:52:39.330 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:40 np0005603500 nova_compute[182934]: 2026-01-31 06:52:40.513 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:44 np0005603500 nova_compute[182934]: 2026-01-31 06:52:44.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:52:44 np0005603500 nova_compute[182934]: 2026-01-31 06:52:44.332 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.515 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.672 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.672 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.672 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.672 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.807 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.808 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5701MB free_disk=73.21134948730469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.808 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:52:45 np0005603500 nova_compute[182934]: 2026-01-31 06:52:45.808 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:52:46 np0005603500 podman[223818]: 2026-01-31 06:52:46.116366402 +0000 UTC m=+0.036825298 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:52:46 np0005603500 podman[223817]: 2026-01-31 06:52:46.121207906 +0000 UTC m=+0.044323347 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:52:46 np0005603500 nova_compute[182934]: 2026-01-31 06:52:46.896 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:52:46 np0005603500 nova_compute[182934]: 2026-01-31 06:52:46.897 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:52:46 np0005603500 nova_compute[182934]: 2026-01-31 06:52:46.937 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:52:47 np0005603500 nova_compute[182934]: 2026-01-31 06:52:47.451 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:52:47 np0005603500 nova_compute[182934]: 2026-01-31 06:52:47.452 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:52:47 np0005603500 nova_compute[182934]: 2026-01-31 06:52:47.453 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:52:49 np0005603500 nova_compute[182934]: 2026-01-31 06:52:49.334 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:49 np0005603500 nova_compute[182934]: 2026-01-31 06:52:49.452 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:52:50 np0005603500 nova_compute[182934]: 2026-01-31 06:52:50.517 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:51 np0005603500 nova_compute[182934]: 2026-01-31 06:52:51.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:52:51 np0005603500 nova_compute[182934]: 2026-01-31 06:52:51.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:52:51 np0005603500 nova_compute[182934]: 2026-01-31 06:52:51.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:52:51 np0005603500 nova_compute[182934]: 2026-01-31 06:52:51.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:52:52 np0005603500 nova_compute[182934]: 2026-01-31 06:52:52.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:52:54 np0005603500 nova_compute[182934]: 2026-01-31 06:52:54.335 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:55 np0005603500 nova_compute[182934]: 2026-01-31 06:52:55.518 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:52:55 np0005603500 podman[223861]: 2026-01-31 06:52:55.594342717 +0000 UTC m=+0.047043623 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, name=ubi9/ubi-minimal, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 31 01:52:55 np0005603500 podman[223860]: 2026-01-31 06:52:55.609156356 +0000 UTC m=+0.063234666 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 31 01:52:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:52:56.544 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:52:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:52:56.545 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:52:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:52:56.545 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:52:59 np0005603500 nova_compute[182934]: 2026-01-31 06:52:59.338 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:00 np0005603500 nova_compute[182934]: 2026-01-31 06:53:00.521 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:03 np0005603500 podman[223906]: 2026-01-31 06:53:03.133337299 +0000 UTC m=+0.049075248 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:53:03 np0005603500 podman[223905]: 2026-01-31 06:53:03.137553362 +0000 UTC m=+0.055834561 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127)
Jan 31 01:53:04 np0005603500 nova_compute[182934]: 2026-01-31 06:53:04.339 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:05 np0005603500 nova_compute[182934]: 2026-01-31 06:53:05.524 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:09 np0005603500 nova_compute[182934]: 2026-01-31 06:53:09.341 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:10 np0005603500 nova_compute[182934]: 2026-01-31 06:53:10.526 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:14 np0005603500 nova_compute[182934]: 2026-01-31 06:53:14.363 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:15 np0005603500 nova_compute[182934]: 2026-01-31 06:53:15.527 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:17 np0005603500 podman[223944]: 2026-01-31 06:53:17.133379667 +0000 UTC m=+0.051668479 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:53:17 np0005603500 podman[223945]: 2026-01-31 06:53:17.161328603 +0000 UTC m=+0.071366914 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 01:53:19 np0005603500 nova_compute[182934]: 2026-01-31 06:53:19.364 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:20 np0005603500 nova_compute[182934]: 2026-01-31 06:53:20.528 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:24 np0005603500 nova_compute[182934]: 2026-01-31 06:53:24.365 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:25 np0005603500 nova_compute[182934]: 2026-01-31 06:53:25.530 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:26 np0005603500 podman[223989]: 2026-01-31 06:53:26.136757165 +0000 UTC m=+0.049670845 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1769056855, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Jan 31 01:53:26 np0005603500 podman[223988]: 2026-01-31 06:53:26.156799291 +0000 UTC m=+0.072292253 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 01:53:29 np0005603500 nova_compute[182934]: 2026-01-31 06:53:29.406 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:30 np0005603500 nova_compute[182934]: 2026-01-31 06:53:30.552 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:34 np0005603500 podman[224036]: 2026-01-31 06:53:34.128840221 +0000 UTC m=+0.050548624 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 01:53:34 np0005603500 podman[224037]: 2026-01-31 06:53:34.129724069 +0000 UTC m=+0.048113717 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 31 01:53:34 np0005603500 nova_compute[182934]: 2026-01-31 06:53:34.406 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:35 np0005603500 nova_compute[182934]: 2026-01-31 06:53:35.554 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:39 np0005603500 nova_compute[182934]: 2026-01-31 06:53:39.450 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:40 np0005603500 nova_compute[182934]: 2026-01-31 06:53:40.587 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:44 np0005603500 nova_compute[182934]: 2026-01-31 06:53:44.451 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.588 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.668 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.790 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.791 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5729MB free_disk=73.21134948730469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.791 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:53:45 np0005603500 nova_compute[182934]: 2026-01-31 06:53:45.792 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:53:46 np0005603500 nova_compute[182934]: 2026-01-31 06:53:46.833 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:53:46 np0005603500 nova_compute[182934]: 2026-01-31 06:53:46.834 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:53:46 np0005603500 nova_compute[182934]: 2026-01-31 06:53:46.859 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:53:47 np0005603500 nova_compute[182934]: 2026-01-31 06:53:47.366 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:53:47 np0005603500 nova_compute[182934]: 2026-01-31 06:53:47.368 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:53:47 np0005603500 nova_compute[182934]: 2026-01-31 06:53:47.368 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:53:48 np0005603500 podman[224080]: 2026-01-31 06:53:48.120153368 +0000 UTC m=+0.038171543 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:53:48 np0005603500 podman[224081]: 2026-01-31 06:53:48.124846198 +0000 UTC m=+0.039508487 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:53:48 np0005603500 nova_compute[182934]: 2026-01-31 06:53:48.367 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:53:48 np0005603500 nova_compute[182934]: 2026-01-31 06:53:48.367 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:53:49 np0005603500 nova_compute[182934]: 2026-01-31 06:53:49.453 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:50 np0005603500 nova_compute[182934]: 2026-01-31 06:53:50.591 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:51 np0005603500 nova_compute[182934]: 2026-01-31 06:53:51.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:53:53 np0005603500 nova_compute[182934]: 2026-01-31 06:53:53.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:53:53 np0005603500 nova_compute[182934]: 2026-01-31 06:53:53.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:53:53 np0005603500 nova_compute[182934]: 2026-01-31 06:53:53.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:53:54 np0005603500 nova_compute[182934]: 2026-01-31 06:53:54.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:53:54 np0005603500 nova_compute[182934]: 2026-01-31 06:53:54.454 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:54 np0005603500 nova_compute[182934]: 2026-01-31 06:53:54.659 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:53:55 np0005603500 nova_compute[182934]: 2026-01-31 06:53:55.593 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:53:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:53:56.573 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:53:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:53:56.574 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:53:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:53:56.574 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:53:57 np0005603500 podman[224123]: 2026-01-31 06:53:57.134541304 +0000 UTC m=+0.051048853 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Jan 31 01:53:57 np0005603500 podman[224122]: 2026-01-31 06:53:57.155411456 +0000 UTC m=+0.073938669 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 31 01:53:59 np0005603500 nova_compute[182934]: 2026-01-31 06:53:59.455 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:00 np0005603500 nova_compute[182934]: 2026-01-31 06:54:00.595 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:04 np0005603500 nova_compute[182934]: 2026-01-31 06:54:04.457 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:05 np0005603500 podman[224168]: 2026-01-31 06:54:05.127387708 +0000 UTC m=+0.040933192 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:54:05 np0005603500 podman[224167]: 2026-01-31 06:54:05.127376067 +0000 UTC m=+0.044791783 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:54:05 np0005603500 nova_compute[182934]: 2026-01-31 06:54:05.597 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:09 np0005603500 nova_compute[182934]: 2026-01-31 06:54:09.459 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:10 np0005603500 nova_compute[182934]: 2026-01-31 06:54:10.599 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:14 np0005603500 nova_compute[182934]: 2026-01-31 06:54:14.461 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:15 np0005603500 nova_compute[182934]: 2026-01-31 06:54:15.601 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:54:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:54:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:54:19 np0005603500 podman[224211]: 2026-01-31 06:54:19.11639175 +0000 UTC m=+0.039677452 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 31 01:54:19 np0005603500 podman[224212]: 2026-01-31 06:54:19.117512815 +0000 UTC m=+0.037263084 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 01:54:19 np0005603500 nova_compute[182934]: 2026-01-31 06:54:19.464 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:20 np0005603500 nova_compute[182934]: 2026-01-31 06:54:20.603 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:24 np0005603500 nova_compute[182934]: 2026-01-31 06:54:24.466 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:25 np0005603500 nova_compute[182934]: 2026-01-31 06:54:25.605 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:28 np0005603500 podman[224252]: 2026-01-31 06:54:28.13529461 +0000 UTC m=+0.051532577 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9/ubi-minimal, release=1769056855, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': 
True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Jan 31 01:54:28 np0005603500 podman[224251]: 2026-01-31 06:54:28.144335718 +0000 UTC m=+0.067286469 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, 
container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 01:54:29 np0005603500 nova_compute[182934]: 2026-01-31 06:54:29.466 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:30 np0005603500 nova_compute[182934]: 2026-01-31 06:54:30.607 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:34 np0005603500 nova_compute[182934]: 2026-01-31 06:54:34.469 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:35 np0005603500 nova_compute[182934]: 2026-01-31 06:54:35.609 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:36 np0005603500 podman[224297]: 2026-01-31 06:54:36.1352026 +0000 UTC m=+0.046336603 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:54:36 np0005603500 podman[224298]: 2026-01-31 06:54:36.135538141 +0000 UTC m=+0.042278454 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:54:39 np0005603500 nova_compute[182934]: 2026-01-31 06:54:39.471 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:40 np0005603500 nova_compute[182934]: 2026-01-31 06:54:40.610 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:44 np0005603500 nova_compute[182934]: 2026-01-31 06:54:44.473 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:45 np0005603500 nova_compute[182934]: 2026-01-31 06:54:45.611 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:46 np0005603500 nova_compute[182934]: 2026-01-31 06:54:46.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:54:46 np0005603500 nova_compute[182934]: 2026-01-31 06:54:46.671 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:54:46 np0005603500 nova_compute[182934]: 2026-01-31 06:54:46.671 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:54:46 np0005603500 nova_compute[182934]: 2026-01-31 06:54:46.671 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:54:46 np0005603500 nova_compute[182934]: 2026-01-31 06:54:46.672 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:54:46 np0005603500 nova_compute[182934]: 2026-01-31 06:54:46.789 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:54:46 np0005603500 nova_compute[182934]: 2026-01-31 06:54:46.791 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5730MB free_disk=73.21133041381836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:54:46 np0005603500 nova_compute[182934]: 2026-01-31 06:54:46.791 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:54:46 np0005603500 nova_compute[182934]: 2026-01-31 06:54:46.791 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:54:47 np0005603500 nova_compute[182934]: 2026-01-31 06:54:47.836 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:54:47 np0005603500 nova_compute[182934]: 2026-01-31 06:54:47.836 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:54:47 np0005603500 nova_compute[182934]: 2026-01-31 06:54:47.864 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:54:48 np0005603500 nova_compute[182934]: 2026-01-31 06:54:48.371 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:54:48 np0005603500 nova_compute[182934]: 2026-01-31 06:54:48.372 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:54:48 np0005603500 nova_compute[182934]: 2026-01-31 06:54:48.372 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:54:49 np0005603500 nova_compute[182934]: 2026-01-31 06:54:49.373 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:54:49 np0005603500 nova_compute[182934]: 2026-01-31 06:54:49.373 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:54:49 np0005603500 nova_compute[182934]: 2026-01-31 06:54:49.373 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:54:49 np0005603500 nova_compute[182934]: 2026-01-31 06:54:49.475 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:50 np0005603500 podman[224341]: 2026-01-31 06:54:50.128131207 +0000 UTC m=+0.040777096 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:54:50 np0005603500 podman[224342]: 2026-01-31 06:54:50.128362625 +0000 UTC m=+0.038642439 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:54:50 np0005603500 nova_compute[182934]: 2026-01-31 06:54:50.613 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:51 np0005603500 nova_compute[182934]: 2026-01-31 06:54:51.145 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:54:53 np0005603500 nova_compute[182934]: 2026-01-31 06:54:53.150 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:54:54 np0005603500 nova_compute[182934]: 2026-01-31 06:54:54.154 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:54:54 np0005603500 nova_compute[182934]: 2026-01-31 06:54:54.477 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:55 np0005603500 nova_compute[182934]: 2026-01-31 06:54:55.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:54:55 np0005603500 nova_compute[182934]: 2026-01-31 06:54:55.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:54:55 np0005603500 nova_compute[182934]: 2026-01-31 06:54:55.616 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:54:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:54:56.639 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:54:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:54:56.639 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:54:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:54:56.639 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:54:59 np0005603500 podman[224387]: 2026-01-31 06:54:59.140232291 +0000 UTC m=+0.052802689 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, version=9.7, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 31 01:54:59 np0005603500 podman[224386]: 2026-01-31 06:54:59.174274262 +0000 UTC m=+0.093864712 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 31 01:54:59 np0005603500 nova_compute[182934]: 2026-01-31 06:54:59.478 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:00 np0005603500 nova_compute[182934]: 2026-01-31 06:55:00.618 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:04 np0005603500 nova_compute[182934]: 2026-01-31 06:55:04.481 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:05 np0005603500 nova_compute[182934]: 2026-01-31 06:55:05.620 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:07 np0005603500 podman[224432]: 2026-01-31 06:55:07.122453038 +0000 UTC m=+0.043374658 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:55:07 np0005603500 podman[224431]: 2026-01-31 06:55:07.15524806 +0000 UTC m=+0.078692941 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 31 01:55:09 np0005603500 nova_compute[182934]: 2026-01-31 06:55:09.485 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:10 np0005603500 nova_compute[182934]: 2026-01-31 06:55:10.621 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:11 np0005603500 nova_compute[182934]: 2026-01-31 06:55:11.952 182938 DEBUG oslo_concurrency.processutils [None req-7654c26b-97b8-4b44-9ec0-21984c53737d 6877c6eb4b144b98803bc5d2fcaf84e1 258ad37c7a194c8cb9fd805ff19f8fe0 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Jan 31 01:55:11 np0005603500 nova_compute[182934]: 2026-01-31 06:55:11.968 182938 DEBUG oslo_concurrency.processutils [None req-7654c26b-97b8-4b44-9ec0-21984c53737d 6877c6eb4b144b98803bc5d2fcaf84e1 258ad37c7a194c8cb9fd805ff19f8fe0 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Jan 31 01:55:14 np0005603500 nova_compute[182934]: 2026-01-31 06:55:14.485 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:15 np0005603500 nova_compute[182934]: 2026-01-31 06:55:15.623 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:55:18.844 104644 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '6e:a1:7b', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '8a:a5:ae:02:04:7e'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 31 01:55:18 np0005603500 nova_compute[182934]: 2026-01-31 06:55:18.844 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:55:18.845 104644 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 31 01:55:18 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:55:18.846 104644 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fe203bcd-9b71-4c38-9736-f063b4ce4137, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 01:55:19 np0005603500 nova_compute[182934]: 2026-01-31 06:55:19.487 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:20 np0005603500 nova_compute[182934]: 2026-01-31 06:55:20.624 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:21 np0005603500 podman[224477]: 2026-01-31 06:55:21.11523663 +0000 UTC m=+0.035556810 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 01:55:21 np0005603500 podman[224476]: 2026-01-31 06:55:21.115187489 +0000 UTC m=+0.040015613 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:55:24 np0005603500 nova_compute[182934]: 2026-01-31 06:55:24.489 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:25 np0005603500 nova_compute[182934]: 2026-01-31 06:55:25.626 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:29 np0005603500 nova_compute[182934]: 2026-01-31 06:55:29.492 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:30 np0005603500 podman[224519]: 2026-01-31 06:55:30.12630413 +0000 UTC m=+0.046867769 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, release=1769056855, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Jan 31 01:55:30 np0005603500 podman[224518]: 2026-01-31 06:55:30.143187947 +0000 UTC m=+0.065836232 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller)
Jan 31 01:55:30 np0005603500 nova_compute[182934]: 2026-01-31 06:55:30.629 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:34 np0005603500 nova_compute[182934]: 2026-01-31 06:55:34.494 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:35 np0005603500 nova_compute[182934]: 2026-01-31 06:55:35.631 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:38 np0005603500 podman[224567]: 2026-01-31 06:55:38.124664911 +0000 UTC m=+0.038345799 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:55:38 np0005603500 podman[224566]: 2026-01-31 06:55:38.125628782 +0000 UTC m=+0.042006866 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 01:55:39 np0005603500 nova_compute[182934]: 2026-01-31 06:55:39.496 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:40 np0005603500 nova_compute[182934]: 2026-01-31 06:55:40.634 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:44 np0005603500 nova_compute[182934]: 2026-01-31 06:55:44.496 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:45 np0005603500 nova_compute[182934]: 2026-01-31 06:55:45.636 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:46 np0005603500 nova_compute[182934]: 2026-01-31 06:55:46.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:46 np0005603500 nova_compute[182934]: 2026-01-31 06:55:46.699 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:55:46 np0005603500 nova_compute[182934]: 2026-01-31 06:55:46.700 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:55:46 np0005603500 nova_compute[182934]: 2026-01-31 06:55:46.700 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:55:46 np0005603500 nova_compute[182934]: 2026-01-31 06:55:46.701 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:55:46 np0005603500 nova_compute[182934]: 2026-01-31 06:55:46.812 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:55:46 np0005603500 nova_compute[182934]: 2026-01-31 06:55:46.813 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5753MB free_disk=73.20913314819336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:55:46 np0005603500 nova_compute[182934]: 2026-01-31 06:55:46.813 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:55:46 np0005603500 nova_compute[182934]: 2026-01-31 06:55:46.813 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:55:47 np0005603500 nova_compute[182934]: 2026-01-31 06:55:47.923 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:55:47 np0005603500 nova_compute[182934]: 2026-01-31 06:55:47.923 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:55:47 np0005603500 nova_compute[182934]: 2026-01-31 06:55:47.943 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:55:48 np0005603500 nova_compute[182934]: 2026-01-31 06:55:48.464 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:55:48 np0005603500 nova_compute[182934]: 2026-01-31 06:55:48.466 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:55:48 np0005603500 nova_compute[182934]: 2026-01-31 06:55:48.466 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:55:48 np0005603500 nova_compute[182934]: 2026-01-31 06:55:48.467 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:49 np0005603500 nova_compute[182934]: 2026-01-31 06:55:49.498 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:50 np0005603500 nova_compute[182934]: 2026-01-31 06:55:50.055 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:50 np0005603500 nova_compute[182934]: 2026-01-31 06:55:50.055 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:50 np0005603500 nova_compute[182934]: 2026-01-31 06:55:50.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:50 np0005603500 nova_compute[182934]: 2026-01-31 06:55:50.638 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:52 np0005603500 podman[224611]: 2026-01-31 06:55:52.123504724 +0000 UTC m=+0.045250838 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:55:52 np0005603500 podman[224612]: 2026-01-31 06:55:52.127535652 +0000 UTC m=+0.044946559 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 01:55:52 np0005603500 nova_compute[182934]: 2026-01-31 06:55:52.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:54 np0005603500 nova_compute[182934]: 2026-01-31 06:55:54.498 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:55 np0005603500 nova_compute[182934]: 2026-01-31 06:55:55.142 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:55 np0005603500 nova_compute[182934]: 2026-01-31 06:55:55.640 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:55 np0005603500 nova_compute[182934]: 2026-01-31 06:55:55.659 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:55 np0005603500 nova_compute[182934]: 2026-01-31 06:55:55.660 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:56 np0005603500 nova_compute[182934]: 2026-01-31 06:55:56.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:56 np0005603500 nova_compute[182934]: 2026-01-31 06:55:56.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:55:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:55:56.697 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:55:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:55:56.698 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:55:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:55:56.698 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:55:59 np0005603500 nova_compute[182934]: 2026-01-31 06:55:59.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:55:59 np0005603500 nova_compute[182934]: 2026-01-31 06:55:59.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Jan 31 01:55:59 np0005603500 nova_compute[182934]: 2026-01-31 06:55:59.587 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:55:59 np0005603500 nova_compute[182934]: 2026-01-31 06:55:59.710 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Jan 31 01:56:00 np0005603500 nova_compute[182934]: 2026-01-31 06:56:00.642 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:01 np0005603500 podman[224657]: 2026-01-31 06:56:01.123099619 +0000 UTC m=+0.043204943 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.7, architecture=x86_64, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, release=1769056855, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 01:56:01 np0005603500 podman[224656]: 2026-01-31 06:56:01.142337 +0000 UTC m=+0.065680647 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 31 01:56:04 np0005603500 nova_compute[182934]: 2026-01-31 06:56:04.589 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:05 np0005603500 nova_compute[182934]: 2026-01-31 06:56:05.644 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:09 np0005603500 podman[224704]: 2026-01-31 06:56:09.126777448 +0000 UTC m=+0.049469162 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute)
Jan 31 01:56:09 np0005603500 podman[224705]: 2026-01-31 06:56:09.127506452 +0000 UTC m=+0.045135565 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:56:09 np0005603500 nova_compute[182934]: 2026-01-31 06:56:09.590 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:10 np0005603500 nova_compute[182934]: 2026-01-31 06:56:10.645 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:14 np0005603500 nova_compute[182934]: 2026-01-31 06:56:14.592 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:15 np0005603500 nova_compute[182934]: 2026-01-31 06:56:15.647 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.986 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.987 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:56:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:56:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:56:18 np0005603500 nova_compute[182934]: 2026-01-31 06:56:18.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:18 np0005603500 nova_compute[182934]: 2026-01-31 06:56:18.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Jan 31 01:56:19 np0005603500 nova_compute[182934]: 2026-01-31 06:56:19.594 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:20 np0005603500 nova_compute[182934]: 2026-01-31 06:56:20.648 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:23 np0005603500 podman[224749]: 2026-01-31 06:56:23.126462641 +0000 UTC m=+0.041057616 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 31 01:56:23 np0005603500 podman[224748]: 2026-01-31 06:56:23.127157252 +0000 UTC m=+0.046101705 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:56:24 np0005603500 nova_compute[182934]: 2026-01-31 06:56:24.595 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:25 np0005603500 nova_compute[182934]: 2026-01-31 06:56:25.649 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:28 np0005603500 nova_compute[182934]: 2026-01-31 06:56:28.324 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:29 np0005603500 nova_compute[182934]: 2026-01-31 06:56:29.596 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:30 np0005603500 nova_compute[182934]: 2026-01-31 06:56:30.651 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:32 np0005603500 podman[224794]: 2026-01-31 06:56:32.12934112 +0000 UTC m=+0.048062198 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal)
Jan 31 01:56:32 np0005603500 podman[224793]: 2026-01-31 06:56:32.143422847 +0000 UTC m=+0.065734389 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, 
io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 01:56:34 np0005603500 nova_compute[182934]: 2026-01-31 06:56:34.598 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:35 np0005603500 nova_compute[182934]: 2026-01-31 06:56:35.652 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:39 np0005603500 nova_compute[182934]: 2026-01-31 06:56:39.600 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:40 np0005603500 podman[224839]: 2026-01-31 06:56:40.125393336 +0000 UTC m=+0.040394374 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:56:40 np0005603500 podman[224838]: 2026-01-31 06:56:40.125386356 +0000 UTC m=+0.044228457 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 31 01:56:40 np0005603500 nova_compute[182934]: 2026-01-31 06:56:40.654 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:44 np0005603500 nova_compute[182934]: 2026-01-31 06:56:44.603 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:45 np0005603500 nova_compute[182934]: 2026-01-31 06:56:45.656 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.149 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.665 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.665 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.666 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.666 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.791 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.792 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5746MB free_disk=73.20911407470703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.792 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:56:47 np0005603500 nova_compute[182934]: 2026-01-31 06:56:47.792 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:56:49 np0005603500 nova_compute[182934]: 2026-01-31 06:56:49.154 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:56:49 np0005603500 nova_compute[182934]: 2026-01-31 06:56:49.155 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:56:49 np0005603500 nova_compute[182934]: 2026-01-31 06:56:49.494 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing inventories for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Jan 31 01:56:49 np0005603500 nova_compute[182934]: 2026-01-31 06:56:49.605 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:50 np0005603500 nova_compute[182934]: 2026-01-31 06:56:50.081 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating ProviderTree inventory for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Jan 31 01:56:50 np0005603500 nova_compute[182934]: 2026-01-31 06:56:50.081 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Updating inventory in ProviderTree for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 01:56:50 np0005603500 nova_compute[182934]: 2026-01-31 06:56:50.097 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing aggregate associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Jan 31 01:56:50 np0005603500 nova_compute[182934]: 2026-01-31 06:56:50.120 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Refreshing trait associations for resource provider b70e363b-8d1d-4e70-9fa4-9b0009536a59, traits: COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,HW_ARCH_X86_64,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_CRB,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SHA,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_FMA3,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI _refresh_associations 
/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Jan 31 01:56:50 np0005603500 nova_compute[182934]: 2026-01-31 06:56:50.138 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:56:50 np0005603500 nova_compute[182934]: 2026-01-31 06:56:50.646 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:56:50 np0005603500 nova_compute[182934]: 2026-01-31 06:56:50.648 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:56:50 np0005603500 nova_compute[182934]: 2026-01-31 06:56:50.648 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:56:50 np0005603500 nova_compute[182934]: 2026-01-31 06:56:50.658 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:52 np0005603500 nova_compute[182934]: 2026-01-31 06:56:52.646 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:52 np0005603500 nova_compute[182934]: 2026-01-31 06:56:52.647 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:52 np0005603500 nova_compute[182934]: 2026-01-31 06:56:52.647 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:54 np0005603500 podman[224880]: 2026-01-31 06:56:54.126496723 +0000 UTC m=+0.044176135 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:56:54 np0005603500 podman[224881]: 2026-01-31 06:56:54.14841864 +0000 UTC m=+0.063207970 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 01:56:54 np0005603500 nova_compute[182934]: 2026-01-31 06:56:54.608 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:55 np0005603500 nova_compute[182934]: 2026-01-31 06:56:55.659 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:56:56 np0005603500 nova_compute[182934]: 2026-01-31 06:56:56.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:56:56.715 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:56:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:56:56.716 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:56:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:56:56.716 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:56:57 np0005603500 nova_compute[182934]: 2026-01-31 06:56:57.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:57 np0005603500 nova_compute[182934]: 2026-01-31 06:56:57.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:56:57 np0005603500 nova_compute[182934]: 2026-01-31 06:56:57.148 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:56:59 np0005603500 nova_compute[182934]: 2026-01-31 06:56:59.609 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:00 np0005603500 nova_compute[182934]: 2026-01-31 06:57:00.661 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:03 np0005603500 podman[224925]: 2026-01-31 06:57:03.171496363 +0000 UTC m=+0.086208961 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, release=1769056855, version=9.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., 
vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, managed_by=edpm_ansible)
Jan 31 01:57:03 np0005603500 podman[224924]: 2026-01-31 06:57:03.186956563 +0000 UTC m=+0.105272985 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ovn_controller)
Jan 31 01:57:04 np0005603500 nova_compute[182934]: 2026-01-31 06:57:04.613 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:05 np0005603500 nova_compute[182934]: 2026-01-31 06:57:05.663 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:07 np0005603500 nova_compute[182934]: 2026-01-31 06:57:07.674 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:09 np0005603500 nova_compute[182934]: 2026-01-31 06:57:09.614 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:10 np0005603500 nova_compute[182934]: 2026-01-31 06:57:10.666 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:11 np0005603500 podman[224969]: 2026-01-31 06:57:11.139999393 +0000 UTC m=+0.058046724 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 31 01:57:11 np0005603500 podman[224970]: 2026-01-31 06:57:11.161364243 +0000 UTC m=+0.075929993 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 31 01:57:14 np0005603500 nova_compute[182934]: 2026-01-31 06:57:14.616 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:15 np0005603500 nova_compute[182934]: 2026-01-31 06:57:15.667 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:19 np0005603500 nova_compute[182934]: 2026-01-31 06:57:19.619 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:20 np0005603500 nova_compute[182934]: 2026-01-31 06:57:20.670 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:24 np0005603500 nova_compute[182934]: 2026-01-31 06:57:24.621 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:25 np0005603500 podman[225012]: 2026-01-31 06:57:25.126277027 +0000 UTC m=+0.042859551 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 31 01:57:25 np0005603500 podman[225011]: 2026-01-31 06:57:25.150344883 +0000 UTC m=+0.069755517 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:57:25 np0005603500 nova_compute[182934]: 2026-01-31 06:57:25.672 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:29 np0005603500 nova_compute[182934]: 2026-01-31 06:57:29.623 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:30 np0005603500 nova_compute[182934]: 2026-01-31 06:57:30.673 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:34 np0005603500 podman[225055]: 2026-01-31 06:57:34.145360703 +0000 UTC m=+0.067177265 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 31 01:57:34 np0005603500 podman[225056]: 2026-01-31 06:57:34.154794312 +0000 UTC m=+0.074792017 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1769056855, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 31 01:57:34 np0005603500 nova_compute[182934]: 2026-01-31 06:57:34.624 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:35 np0005603500 nova_compute[182934]: 2026-01-31 06:57:35.674 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:39 np0005603500 nova_compute[182934]: 2026-01-31 06:57:39.625 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:40 np0005603500 nova_compute[182934]: 2026-01-31 06:57:40.675 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:42 np0005603500 podman[225099]: 2026-01-31 06:57:42.126437013 +0000 UTC m=+0.049354859 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 01:57:42 np0005603500 podman[225100]: 2026-01-31 06:57:42.12667019 +0000 UTC m=+0.047157909 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 31 01:57:44 np0005603500 nova_compute[182934]: 2026-01-31 06:57:44.627 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:45 np0005603500 nova_compute[182934]: 2026-01-31 06:57:45.677 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.628 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.667 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.667 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.803 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.804 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5747MB free_disk=73.20862579345703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.804 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:57:49 np0005603500 nova_compute[182934]: 2026-01-31 06:57:49.805 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:57:50 np0005603500 nova_compute[182934]: 2026-01-31 06:57:50.679 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:50 np0005603500 nova_compute[182934]: 2026-01-31 06:57:50.844 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:57:50 np0005603500 nova_compute[182934]: 2026-01-31 06:57:50.844 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:57:50 np0005603500 nova_compute[182934]: 2026-01-31 06:57:50.863 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:57:51 np0005603500 nova_compute[182934]: 2026-01-31 06:57:51.368 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:57:51 np0005603500 nova_compute[182934]: 2026-01-31 06:57:51.370 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:57:51 np0005603500 nova_compute[182934]: 2026-01-31 06:57:51.370 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:57:54 np0005603500 nova_compute[182934]: 2026-01-31 06:57:54.371 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:54 np0005603500 nova_compute[182934]: 2026-01-31 06:57:54.371 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:54 np0005603500 nova_compute[182934]: 2026-01-31 06:57:54.630 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:55 np0005603500 nova_compute[182934]: 2026-01-31 06:57:55.681 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:57:56 np0005603500 podman[225142]: 2026-01-31 06:57:56.123194431 +0000 UTC m=+0.041124147 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 31 01:57:56 np0005603500 podman[225143]: 2026-01-31 06:57:56.124194713 +0000 UTC m=+0.039918290 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 01:57:56 np0005603500 nova_compute[182934]: 2026-01-31 06:57:56.142 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:57:56.738 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:57:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:57:56.738 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:57:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:57:56.738 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:57:58 np0005603500 nova_compute[182934]: 2026-01-31 06:57:58.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:59 np0005603500 nova_compute[182934]: 2026-01-31 06:57:59.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:59 np0005603500 nova_compute[182934]: 2026-01-31 06:57:59.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:57:59 np0005603500 nova_compute[182934]: 2026-01-31 06:57:59.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:57:59 np0005603500 nova_compute[182934]: 2026-01-31 06:57:59.630 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:00 np0005603500 nova_compute[182934]: 2026-01-31 06:58:00.682 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:04 np0005603500 nova_compute[182934]: 2026-01-31 06:58:04.635 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:05 np0005603500 podman[225185]: 2026-01-31 06:58:05.131262388 +0000 UTC m=+0.045992969 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vcs-type=git, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64)
Jan 31 01:58:05 np0005603500 podman[225184]: 2026-01-31 06:58:05.168335046 +0000 UTC m=+0.086768043 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 01:58:05 np0005603500 nova_compute[182934]: 2026-01-31 06:58:05.684 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:09 np0005603500 nova_compute[182934]: 2026-01-31 06:58:09.636 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:10 np0005603500 nova_compute[182934]: 2026-01-31 06:58:10.686 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:13 np0005603500 podman[225229]: 2026-01-31 06:58:13.130534044 +0000 UTC m=+0.044200283 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 31 01:58:13 np0005603500 podman[225228]: 2026-01-31 06:58:13.149952906 +0000 UTC m=+0.073587838 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute)
Jan 31 01:58:14 np0005603500 nova_compute[182934]: 2026-01-31 06:58:14.637 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:15 np0005603500 nova_compute[182934]: 2026-01-31 06:58:15.688 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.987 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.988 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 01:58:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 06:58:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 01:58:19 np0005603500 nova_compute[182934]: 2026-01-31 06:58:19.639 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:20 np0005603500 nova_compute[182934]: 2026-01-31 06:58:20.690 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:24 np0005603500 nova_compute[182934]: 2026-01-31 06:58:24.639 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:25 np0005603500 nova_compute[182934]: 2026-01-31 06:58:25.691 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:27 np0005603500 podman[225273]: 2026-01-31 06:58:27.129551026 +0000 UTC m=+0.045313507 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 01:58:27 np0005603500 podman[225272]: 2026-01-31 06:58:27.155352109 +0000 UTC m=+0.077611944 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 31 01:58:29 np0005603500 nova_compute[182934]: 2026-01-31 06:58:29.642 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:30 np0005603500 nova_compute[182934]: 2026-01-31 06:58:30.693 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:34 np0005603500 nova_compute[182934]: 2026-01-31 06:58:34.644 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:35 np0005603500 nova_compute[182934]: 2026-01-31 06:58:35.694 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:36 np0005603500 podman[225314]: 2026-01-31 06:58:36.137508377 +0000 UTC m=+0.051953326 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1769056855, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., version=9.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 31 01:58:36 np0005603500 podman[225313]: 2026-01-31 06:58:36.164273421 +0000 UTC m=+0.082732257 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 01:58:39 np0005603500 nova_compute[182934]: 2026-01-31 06:58:39.646 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:40 np0005603500 nova_compute[182934]: 2026-01-31 06:58:40.695 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:44 np0005603500 podman[225362]: 2026-01-31 06:58:44.130982062 +0000 UTC m=+0.050403627 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 31 01:58:44 np0005603500 podman[225363]: 2026-01-31 06:58:44.161858514 +0000 UTC m=+0.078440430 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 31 01:58:44 np0005603500 nova_compute[182934]: 2026-01-31 06:58:44.647 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:45 np0005603500 nova_compute[182934]: 2026-01-31 06:58:45.697 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.648 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.666 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.666 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.666 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.666 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.783 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.784 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5753MB free_disk=73.20862579345703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.784 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:58:49 np0005603500 nova_compute[182934]: 2026-01-31 06:58:49.785 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:58:50 np0005603500 nova_compute[182934]: 2026-01-31 06:58:50.699 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:50 np0005603500 nova_compute[182934]: 2026-01-31 06:58:50.822 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:58:50 np0005603500 nova_compute[182934]: 2026-01-31 06:58:50.823 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:58:50 np0005603500 nova_compute[182934]: 2026-01-31 06:58:50.842 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:58:51 np0005603500 nova_compute[182934]: 2026-01-31 06:58:51.348 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:58:51 np0005603500 nova_compute[182934]: 2026-01-31 06:58:51.350 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:58:51 np0005603500 nova_compute[182934]: 2026-01-31 06:58:51.350 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:58:52 np0005603500 nova_compute[182934]: 2026-01-31 06:58:52.350 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:58:53 np0005603500 nova_compute[182934]: 2026-01-31 06:58:53.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:58:54 np0005603500 nova_compute[182934]: 2026-01-31 06:58:54.148 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:58:54 np0005603500 nova_compute[182934]: 2026-01-31 06:58:54.650 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:55 np0005603500 nova_compute[182934]: 2026-01-31 06:58:55.701 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:58:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:58:56.799 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:58:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:58:56.799 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:58:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:58:56.799 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:58:58 np0005603500 podman[225404]: 2026-01-31 06:58:58.12581231 +0000 UTC m=+0.047168915 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 31 01:58:58 np0005603500 nova_compute[182934]: 2026-01-31 06:58:58.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:58:58 np0005603500 podman[225405]: 2026-01-31 06:58:58.154492864 +0000 UTC m=+0.073575508 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 31 01:58:59 np0005603500 nova_compute[182934]: 2026-01-31 06:58:59.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:58:59 np0005603500 nova_compute[182934]: 2026-01-31 06:58:59.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:58:59 np0005603500 nova_compute[182934]: 2026-01-31 06:58:59.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 01:58:59 np0005603500 nova_compute[182934]: 2026-01-31 06:58:59.652 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:00 np0005603500 nova_compute[182934]: 2026-01-31 06:59:00.702 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:04 np0005603500 nova_compute[182934]: 2026-01-31 06:59:04.654 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:05 np0005603500 nova_compute[182934]: 2026-01-31 06:59:05.703 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:07 np0005603500 podman[225447]: 2026-01-31 06:59:07.144969715 +0000 UTC m=+0.055971193 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, release=1769056855, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 31 01:59:07 np0005603500 podman[225446]: 2026-01-31 06:59:07.159078109 +0000 UTC m=+0.076012554 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 01:59:09 np0005603500 nova_compute[182934]: 2026-01-31 06:59:09.657 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:10 np0005603500 nova_compute[182934]: 2026-01-31 06:59:10.704 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:14 np0005603500 nova_compute[182934]: 2026-01-31 06:59:14.660 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:15 np0005603500 podman[225489]: 2026-01-31 06:59:15.133000605 +0000 UTC m=+0.052971738 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 31 01:59:15 np0005603500 podman[225490]: 2026-01-31 06:59:15.138460297 +0000 UTC m=+0.055421295 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 01:59:15 np0005603500 nova_compute[182934]: 2026-01-31 06:59:15.714 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:19 np0005603500 nova_compute[182934]: 2026-01-31 06:59:19.661 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:20 np0005603500 nova_compute[182934]: 2026-01-31 06:59:20.716 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:24 np0005603500 nova_compute[182934]: 2026-01-31 06:59:24.663 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:25 np0005603500 nova_compute[182934]: 2026-01-31 06:59:25.719 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:29 np0005603500 podman[225534]: 2026-01-31 06:59:29.12640239 +0000 UTC m=+0.045185583 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 31 01:59:29 np0005603500 podman[225535]: 2026-01-31 06:59:29.152515863 +0000 UTC m=+0.066272028 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0)
Jan 31 01:59:29 np0005603500 nova_compute[182934]: 2026-01-31 06:59:29.665 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:30 np0005603500 nova_compute[182934]: 2026-01-31 06:59:30.721 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:34 np0005603500 nova_compute[182934]: 2026-01-31 06:59:34.667 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:35 np0005603500 nova_compute[182934]: 2026-01-31 06:59:35.723 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:38 np0005603500 podman[225573]: 2026-01-31 06:59:38.136443139 +0000 UTC m=+0.050565123 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-type=git, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 31 01:59:38 np0005603500 podman[225572]: 2026-01-31 06:59:38.158954477 +0000 UTC m=+0.077432278 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 01:59:39 np0005603500 nova_compute[182934]: 2026-01-31 06:59:39.670 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:40 np0005603500 nova_compute[182934]: 2026-01-31 06:59:40.725 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:44 np0005603500 nova_compute[182934]: 2026-01-31 06:59:44.672 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:45 np0005603500 nova_compute[182934]: 2026-01-31 06:59:45.727 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:46 np0005603500 podman[225617]: 2026-01-31 06:59:46.137361706 +0000 UTC m=+0.055264291 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Jan 31 01:59:46 np0005603500 podman[225618]: 2026-01-31 06:59:46.151248133 +0000 UTC m=+0.067472125 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.673 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.674 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.674 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.674 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.674 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.801 182938 WARNING nova.virt.libvirt.driver [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.802 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5756MB free_disk=73.20864486694336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.802 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:59:49 np0005603500 nova_compute[182934]: 2026-01-31 06:59:49.803 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:59:50 np0005603500 nova_compute[182934]: 2026-01-31 06:59:50.728 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:50 np0005603500 nova_compute[182934]: 2026-01-31 06:59:50.844 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Jan 31 01:59:50 np0005603500 nova_compute[182934]: 2026-01-31 06:59:50.844 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Jan 31 01:59:50 np0005603500 nova_compute[182934]: 2026-01-31 06:59:50.868 182938 DEBUG nova.compute.provider_tree [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed in ProviderTree for provider: b70e363b-8d1d-4e70-9fa4-9b0009536a59 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 01:59:51 np0005603500 nova_compute[182934]: 2026-01-31 06:59:51.381 182938 DEBUG nova.scheduler.client.report [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Inventory has not changed for provider b70e363b-8d1d-4e70-9fa4-9b0009536a59 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Jan 31 01:59:51 np0005603500 nova_compute[182934]: 2026-01-31 06:59:51.382 182938 DEBUG nova.compute.resource_tracker [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Jan 31 01:59:51 np0005603500 nova_compute[182934]: 2026-01-31 06:59:51.383 182938 DEBUG oslo_concurrency.lockutils [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:59:52 np0005603500 nova_compute[182934]: 2026-01-31 06:59:52.382 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:59:52 np0005603500 nova_compute[182934]: 2026-01-31 06:59:52.383 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:59:54 np0005603500 nova_compute[182934]: 2026-01-31 06:59:54.142 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:59:54 np0005603500 nova_compute[182934]: 2026-01-31 06:59:54.674 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:55 np0005603500 nova_compute[182934]: 2026-01-31 06:59:55.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:59:55 np0005603500 nova_compute[182934]: 2026-01-31 06:59:55.730 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 01:59:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:59:56.846 104644 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Jan 31 01:59:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:59:56.846 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Jan 31 01:59:56 np0005603500 ovn_metadata_agent[104639]: 2026-01-31 06:59:56.846 104644 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Jan 31 01:59:59 np0005603500 nova_compute[182934]: 2026-01-31 06:59:59.143 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 01:59:59 np0005603500 nova_compute[182934]: 2026-01-31 06:59:59.676 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:00:00 np0005603500 podman[225661]: 2026-01-31 07:00:00.130001208 +0000 UTC m=+0.049854441 container health_status 043672bdd4133d397ae48a29dcba0374cd9939a0f3557a7eaadb846e80a9efeb (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 31 02:00:00 np0005603500 nova_compute[182934]: 2026-01-31 07:00:00.146 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:00:00 np0005603500 nova_compute[182934]: 2026-01-31 07:00:00.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:00:00 np0005603500 nova_compute[182934]: 2026-01-31 07:00:00.147 182938 DEBUG oslo_service.periodic_task [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:00:00 np0005603500 nova_compute[182934]: 2026-01-31 07:00:00.147 182938 DEBUG nova.compute.manager [None req-35a071db-9d83-48db-a46c-6262330b7056 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Jan 31 02:00:00 np0005603500 podman[225662]: 2026-01-31 07:00:00.151316659 +0000 UTC m=+0.067911710 container health_status b53cff4214e7b55aea74e24f050c04ec1f05a2f51642ba001407f6bb013a4227 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 31 02:00:00 np0005603500 nova_compute[182934]: 2026-01-31 07:00:00.731 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:00:04 np0005603500 nova_compute[182934]: 2026-01-31 07:00:04.678 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:00:05 np0005603500 nova_compute[182934]: 2026-01-31 07:00:05.732 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:00:09 np0005603500 podman[225706]: 2026-01-31 07:00:09.124587789 +0000 UTC m=+0.047166716 container health_status bc3e33a56dcad28ca2e26dfe9c3cf941c130987acaa9c3d855609fbd2119cb8b (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.7, architecture=x86_64, config_id=openstack_network_exporter)
Jan 31 02:00:09 np0005603500 podman[225705]: 2026-01-31 07:00:09.147433199 +0000 UTC m=+0.069707887 container health_status 072b5c36d9a254e802e08784999ad274dd0f29073260c56994901a2759bd8c4d (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 02:00:09 np0005603500 nova_compute[182934]: 2026-01-31 07:00:09.678 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:00:10 np0005603500 nova_compute[182934]: 2026-01-31 07:00:10.735 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:00:11 np0005603500 systemd-logind[821]: New session 29 of user zuul.
Jan 31 02:00:11 np0005603500 systemd[1]: Started Session 29 of User zuul.
Jan 31 02:00:14 np0005603500 nova_compute[182934]: 2026-01-31 07:00:14.679 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:00:15 np0005603500 nova_compute[182934]: 2026-01-31 07:00:15.736 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:00:15 np0005603500 ovs-vsctl[225926]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 31 02:00:16 np0005603500 virtqemud[183236]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 31 02:00:16 np0005603500 virtqemud[183236]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 31 02:00:16 np0005603500 virtqemud[183236]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 31 02:00:16 np0005603500 podman[226130]: 2026-01-31 07:00:16.861725841 +0000 UTC m=+0.064027707 container health_status d21698b9274565d105426ee57f7deb7681c0e20c7c3ddbf3f0eb8e4d7ce04c0a (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 31 02:00:16 np0005603500 podman[226121]: 2026-01-31 07:00:16.882834705 +0000 UTC m=+0.085082569 container health_status 0019ca32e6f969fce777051ba2cdcfe3d80076b35a41261d74821ba26855998d (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_build_tag=d790bc5e0de33b4fa3f6e15acfa448e0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fb5c85efb4321c385ca89826d7d61c44ebde14c02a6565a6bb6dbc3881561fcd-5f7c79afb943f241c60c116357598c005f60f96406d4e9e55225c6527ff53990-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:d790bc5e0de33b4fa3f6e15acfa448e0', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.988 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f199f43b3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f199f44d040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.989 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f199f44d220>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.989 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f199f451250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f199f44d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f199f44d6a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f199f43b490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.990 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.990 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f199f43b550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f199f44dcd0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f199f44dc10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f199f44d160>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.991 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f199f44d2e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.991 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f199f44d3d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f19a53f3b80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f199f43bca0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f199f44d940>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f199f44db50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.992 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f199f43b340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f199f43b700>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f199f43bbe0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f199f43b0d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f199f436bb0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.993 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.994 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f199f43baf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.994 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.994 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f199f44d4f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.994 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.994 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f199f44d760>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.994 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.994 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f199f43bdf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f19a048b760>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Jan 31 02:00:17 np0005603500 ceilometer_agent_compute[192681]: 2026-01-31 07:00:17.994 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Jan 31 02:00:19 np0005603500 systemd[1]: Starting Hostname Service...
Jan 31 02:00:19 np0005603500 systemd[1]: Started Hostname Service.
Jan 31 02:00:19 np0005603500 nova_compute[182934]: 2026-01-31 07:00:19.681 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:00:20 np0005603500 nova_compute[182934]: 2026-01-31 07:00:20.738 182938 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
